
Track size of page responses in Wikibase
Closed, ResolvedPublic

Assigned To
Authored By
Ladsgroup
Sep 10 2019, 3:41 PM

Description

A drilldown dashboard already exists for Wikidata at https://grafana.wikimedia.org/d/000000095/webpagetest-drilldown?orgId=1&var-wiki=wikidatawiki&var-users=anonymous&var-page=Berlin&var-location=us-east&var-browser=Chrome&var-view=firstView&from=now-2d&to=now but:
1- It doesn't include uncompressed asset sizes
2- It only covers the Item for Berlin, not the special pages we want to improve (or Lexemes)
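To get the missing uncompressed numbers, the raw response body can be measured both before and after decoding. A minimal sketch, assuming a gzip-encoded response body (as ResourceLoader responses typically are); the payload here is synthetic, not a real Wikidata response:

```python
import gzip

def asset_sizes(raw_body: bytes) -> dict:
    """Return transfer (compressed) and decoded (uncompressed) sizes,
    in bytes, for a gzip-encoded response body."""
    return {
        "compressed": len(raw_body),
        "uncompressed": len(gzip.decompress(raw_body)),
    }

# Synthetic payload standing in for a minified JS response:
payload = b"/* minified module code */var a=1;" * 500
sizes = asset_sizes(gzip.compress(payload))
print(sizes["compressed"], sizes["uncompressed"])
```

Both numbers could then be reported to the metrics pipeline, so the dashboard can graph compressed and uncompressed sizes side by side.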

Event Timeline

2 MB of minified JS. This is absolutely crazy.

I have been tracking the progress of the startup module, the modules themselves, response times, etc. since wmf.22 for both Wikidata and Commons. Here are some numbers:

  • The size of the startup module has been going down consistently everywhere. It went from 59.8 KB to 49.4 KB (minified but uncompressed) on Wikidata, and from 59.9 KB to 51.8 KB on Commons (measured the same way). This gives us roughly a 20 GB reduction in network traffic every week.
  • The size of the startup module on the clients also went down, by 2.1 KB (uncompressed), which means a reduction of roughly 89 GB every day.
  • The time between the request for the startup module and the requests for the main modules has dropped consistently as well: ~1.1 seconds for Wikidata and ~0.8 seconds for Commons. That is because on every request the startup module now walks a much simpler dependency graph, which saves considerable CPU cycles per request (even when everything is already cached). This also does not include the time for the server to respond to the startup module request, which will improve as well, because the server no longer needs to build a complex graph.
  • The actual size of the JavaScript modules has increased for Wikidata and decreased for Commons. I couldn't find an explanation for the increase; it may be new frontend code (tainted refs) that went live this week, since the increase is only visible for this week. Even if we assume all of the increase and decrease were caused by our work, the total result is still positive: a 103 GB reduction in network traffic every week.
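The GB-per-week figures above come from multiplying the per-response saving by the request volume. The request counts in this sketch are NOT from the task; they are hypothetical values chosen only to illustrate the arithmetic:

```python
def savings_gb(per_response_kb: float, responses: int) -> float:
    """Network saved, in GB (decimal units), given a per-response size
    reduction in KB and a number of responses served."""
    return per_response_kb * responses / 1_000_000  # KB -> GB

# Hypothetical request volumes, for illustration only:
weekly = savings_gb(59.8 - 49.4, 2_000_000)   # 10.4 KB saved per startup response
daily = savings_gb(2.1, 42_000_000)           # client-side startup module saving
print(round(weekly, 1), round(daily, 1))
```

With those assumed volumes the numbers land near the 20 GB/week and 89 GB/day cited above, which is how the per-response deltas translate into fleet-wide savings.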

@Ladsgroup I see this is done on the terminators board?
Should it be closed?

Done