Launch on-wiki page performance inspector
Open · Normal · Public



We want to make it easier for editors to know if there are things they can do to make the page faster for the user. Let's focus on things that an editor can change and give info about that.


Let's start by setting up a structure that works and add metrics one by one. Here's the list of things we can start with once we have the structure up and running:

  • Show sizes for different modules and matching CSS (use what we have today in mw.loader.inspect())
  • HTML and image sizes. Warn for large images.
  • Show the backend time (how long it takes for the server to serve the page) and compare with the median time for all pages
  • Include the NewPP report T110763
  • Check if the page is in the slow parse log T98563
  • List of Lua module invocations + runtime (see T117173)
  • Estimate how long it would take this page to load on a 2G connection / test on WebPageTest?
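As a starting point for the module-size metric, a minimal sketch of what the inspector could do with the per-module data that mw.loader.inspect() reports today (the entry shape and the size threshold here are assumptions, not the actual implementation):

```javascript
// Hypothetical sketch: given module entries like those reported by
// mw.loader.inspect() (module name + byte size), flag the heaviest
// modules so an editor knows where to look first.
// The sizeThreshold value is an assumption, not an agreed limit.
function findLargeModules(modules, sizeThreshold) {
  return modules
    .filter(function (m) { return m.size > sizeThreshold; })
    .sort(function (a, b) { return b.size - a.size; })
    .map(function (m) { return m.name + ' (' + m.size + ' bytes)'; });
}

var report = findLargeModules([
  { name: 'ext.gadget.foo', size: 120000 },
  { name: 'site.styles', size: 8000 },
  { name: 'ext.gadget.bar', size: 95000 }
], 50000);
// report lists ext.gadget.foo first, then ext.gadget.bar
```

Sorting largest-first keeps the most actionable item at the top, which matches the later requirement of a summary that focuses on what's most important.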

More open tasks relating to Performance Inspector:


I think the thinking should start from the user. The use case, how we expect them to use the information, etc. When asking what metrics should be used, I don't even know what the goal is.

Off the top of my head, I imagine editors being able to act on performance in the following ways:

  • 1 (save performance) Use alternative wikitext/templating/lua that is cheaper to process
  • 2 (page performance) Optimize the article content for performance (e.g. avoid using custom thumbnail sizes, include fewer images in the article)
  • 3 (page performance) Optimize JS and CSS local to the wiki
  • 4 (page performance) Participate in mediawiki performance tasks

1 could be communicated while editing an article. At the very least, show some indication that the article is particularly slow to save; bonus points if we can point to a specific pain point (a specific template, or a portion of Lua code).

2 could also be communicated as hints/warnings during editing. Not sure that my initial examples about images are that strong, but it's an area worth exploring.

3 will be difficult to automate, I think, because it's difficult to isolate custom code added by users and analyze its performance impact. Maybe with synthetic testing we could compare local JS/CSS being active versus being neutered? But the reason people write custom code is to add features, and those will always take time to run. It would be annoying to always warn them that their code takes time, but of course it would be good to have safeguards against it having a disproportionate impact on performance.

4 is the only one where I think that exposing RUM about the article would make sense. It could be a good way to engage editors to report performance issues to Phabricator, at the very least. It would make them care about the issue. But I don't expect overall perf issues on an article to result in something actionable for editors to improve the performance themselves. RUM measurements about the whole article are too far removed from what editors have control over; it would be impossible to draw such a connection.

Gilles renamed this task from Make it easier for editors to find speed bootlenecks to Make it easier for editors to find speed bottlenecks.Nov 3 2015, 12:30 PM
Gilles set Security to None.
ori added a comment.Nov 8 2015, 6:50 PM
  • How many images are on the page? What is their total size?
  • What is the total size of the page HTML?
  • How long would it take this page to load on a 2G connection?
  • Ideally the interface would provide suggestions (cut down on images, etc.), or at least flag potential issues.

Things from mw.loader.inspect():

  • Which modules are loaded on the page? (+sizes)
  • Which style modules are loaded on the page? (+sizes, % matching selectors)
  • What was the backend response time? How does it compare to the median?
  • Is the page in the slow parse logs? (see T98563)
  • All the data currently in the NewPP report in an HTML comment (see T110763)
  • List of Lua module invocations + runtime (see T117173)

Of course, we don't need all of these from day 1. We can start with just a couple of metrics, get feedback from users, and iterate.

Restricted Application added a subscriber: StudiesWorld.Nov 8 2015, 6:50 PM
ori renamed this task from Make it easier for editors to find speed bottlenecks to On-wiki page performance inspector tool.Nov 9 2015, 1:46 AM
ori removed a subscriber: StudiesWorld.
Peter added a comment.Nov 9 2015, 8:44 AM

Cool. I've started on a rule system whose structure we can use, or rather re-use, meaning we can create our own rules with limits and suggestions. It's almost like YSlow :) It should be good and give us a nice way of structuring the code.
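The YSlow-like rule system described above could be sketched roughly as follows (rule names, limits, and the API shape here are all illustrative assumptions, not the actual code): each rule inspects the collected page metrics and returns a suggestion when a limit is exceeded.

```javascript
// Minimal rule-system sketch: each rule gets the collected metrics
// and returns a suggestion string, or null when the page is fine.
// The limits (500 KB images, 200 KB HTML) are placeholder assumptions.
var rules = [
  {
    name: 'image-weight',
    check: function (metrics) {
      if (metrics.imageBytes > 500 * 1024) {
        return 'Images total ' + metrics.imageBytes +
          ' bytes; consider fewer or smaller images.';
      }
      return null;
    }
  },
  {
    name: 'html-size',
    check: function (metrics) {
      if (metrics.htmlBytes > 200 * 1024) {
        return 'HTML is ' + metrics.htmlBytes +
          ' bytes; the page may be slow to parse.';
      }
      return null;
    }
  }
];

// Run every rule and keep only the triggered suggestions.
function runRules(metrics) {
  return rules
    .map(function (rule) { return rule.check(metrics); })
    .filter(function (suggestion) { return suggestion !== null; });
}
```

Adding a metric then only means adding one more rule object, which keeps the structure easy to extend.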

I'll start adding subtasks.

What kind of browser support should we have for this? Work on all current browsers, I guess? Firefox Nightly now supports Resource Timing Level 2, so you can get asset sizes directly from the resource timings; that's nice. Otherwise we need to do an extra request per asset, but that's OK.
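To illustrate the Resource Timing approach: transferSize comes from the Resource Timing Level 2 spec, and entries report 0 when the browser doesn't support it or when a cross-origin resource lacks Timing-Allow-Origin, which is exactly where the extra-request fallback would kick in. A sketch (written as a pure function over an entries array for clarity):

```javascript
// Sum transferSize where available; collect the names of assets whose
// size is unknown (transferSize of 0) as candidates for the
// extra-request-per-asset fallback mentioned above.
function totalTransferSize(entries) {
  var known = 0;
  var unknown = [];
  entries.forEach(function (entry) {
    if (entry.transferSize > 0) {
      known += entry.transferSize;
    } else {
      unknown.push(entry.name);
    }
  });
  return { knownBytes: known, unknownAssets: unknown };
}

// In a browser this would be fed with:
// totalTransferSize(performance.getEntriesByType('resource'))
```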

How long would it take this page to load on a 2G connection?

Slow, medium, fast? :)
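Whatever bucket labels are chosen, the underlying estimate is simple arithmetic over total bytes and request count. A back-of-the-envelope sketch, where the bandwidth and round-trip figures are assumptions (roughly EDGE-class 2G), not measured values:

```javascript
// Rough 2G load-time estimate: transfer time plus one round trip per
// request. Ignores parallelism, connection reuse, and slow start, so
// treat the result as a bucket indicator, not a precise prediction.
function estimate2gLoadSeconds(totalBytes, requestCount) {
  var bandwidthBytesPerSec = 35 * 1024; // ~280 kbit/s, assumed
  var rttSeconds = 0.8;                 // assumed round-trip time
  return totalBytes / bandwidthBytesPerSec + requestCount * rttSeconds;
}
```

The result could then be mapped onto the suggested slow/medium/fast buckets with two thresholds.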

Peter moved this task from Next-up to Doing on the Performance-Team board.Nov 9 2015, 8:45 AM
ori added a subscriber: Matanya.Nov 16 2015, 10:58 PM
Peter added a comment.Jan 12 2016, 8:39 AM

Adding this to make it easier for me to focus on the right things. Here's a summary of the metrics/data to collect:

  • Use what's included in mw.loader.inspect()
  • Image size/number of images/size of the HTML
  • Include the NewPP report T110763
  • Backend time and how it compares to the median time
  • Is the page in the slow parse log T98563 (I need to discuss how to solve this in a performant way)?
  • List of Lua module invocations + runtime (see T117173)
  • How long would it take this page to load on a 2G connection/test on WebPageTest?
  • Metrics from nav timing/user timing/resource timing; in the future we could use these metrics to indicate that something is wrong/can be done better.

Then we need to display it:

  • At the top we should have a summary of suggestions of what to improve, so it's easy to focus on what's most important.
  • List all different metrics

Other requirements:

  • Make it easy to add new metrics and make sure they can be picked up by the summary.
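The extensibility requirement above could look like this (the registration API is a hypothetical shape, not the actual implementation): each metric registers a collector plus an optional summarizer, so a newly added metric automatically feeds both the detailed list and the summary at the top.

```javascript
// Pluggable metric registry sketch: registerMetric() adds a collector
// and an optional summarizer; buildReport() runs them all and splits
// the output into the summary (triggered hints) and the full details.
var metrics = [];

function registerMetric(name, collect, summarize) {
  metrics.push({ name: name, collect: collect, summarize: summarize || null });
}

function buildReport(page) {
  var details = {};
  var summary = [];
  metrics.forEach(function (metric) {
    var value = metric.collect(page);
    details[metric.name] = value;
    if (metric.summarize) {
      var hint = metric.summarize(value);
      if (hint) { summary.push(hint); }
    }
  });
  return { summary: summary, details: details };
}

// Example metric: HTML size, with an assumed 100 KB warning threshold.
registerMetric('htmlBytes',
  function (page) { return page.html.length; },
  function (bytes) {
    return bytes > 100000 ? 'Large HTML (' + bytes + ' bytes)' : null;
  });
```

A metric without a summarizer still shows up in the details, so purely informational data (e.g. the NewPP report) fits the same structure.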
Peter triaged this task as Normal priority.Feb 10 2016, 7:48 AM
Peter updated the task description. (Show Details)Mar 22 2016, 6:45 PM
Peter moved this task from Doing to Backlog on the Performance-Team board.Mar 31 2016, 7:21 PM
Peter moved this task from Backlog to Next-up on the Performance-Team board.
Agabi10 added a subscriber: Agabi10.Jun 7 2016, 1:09 PM
Gilles moved this task from Next-up to Backlog on the Performance-Team board.Jul 12 2017, 7:18 PM
Imarlier moved this task from Backlog to Doing on the Performance-Team board.Jan 18 2018, 5:18 PM
Imarlier claimed this task.
Imarlier added a subscriber: Imarlier.

Needs followup with the Community Liaisons team to figure out where/how to expose this when enabled in production.

Krinkle renamed this task from On-wiki page performance inspector tool to Launch on-wiki page performance inspector.
Imarlier moved this task from Doing to Blocked on the Performance-Team board.Feb 6 2018, 3:18 PM

Blocked on the CL team. Which isn't a big deal, given that this ticket is slightly over 2 years old and they've only had a couple of days...

I've added an item to Tech/News that reads:

The Performance Inspector will be available as an opt-in user preference in the Editing section. It shows related information from different parts of MediaWiki for a specific article.

Please edit boldly to improve that if it is inaccurate or unclear. It will be frozen by early Friday UTC (so that translators have Fri/Sat/Sun to work on it).

Any help adding more details/examples/screenshots to the user help page would also be appreciated. I'll mark that up for translation once it is a bit more detailed.

Krinkle added a comment.EditedApr 25 2018, 8:13 PM

@Quiddity It was previously under the "Beta Feature" section of the Preferences on the Beta Cluster. It is now going to be under a new "Developer Tools" heading of the "Editing" section instead, and is also being promoted from the Beta Cluster to production.

The (otherwise unrelated) preference for "ParserMigration" extension was also moved to this Developer Tools section. For an overview of the Gerrit changes related to this, see T129322#4152371.

However, I'm not sure it should be in Tech News yet, given PI is only on the Beta Cluster at this time. I haven't seen a patch or schedule for production, although it does seem production-ready. EDIT: Never mind, I missed T129322#4158710.

@Krinkle Nod. Do you think any of that needs to be written directly in the Tech/News item? I didn't think it was crucial to understanding, hence I left it out. Note also that Imarlier has kindly expanded the blurb so it has more details about the tool's features.

Also, you might be able to help with my specific questions about the docs, at T186260#3951557 - I think the page needs more details on what users can specifically do, for the Action sections. Possibly screenshots too. I could take and upload images, but I'm not sure which pages make good examples of the various aspects. Suggestions or direct edits appreciated.

Elitre added a subscriber: Elitre.May 2 2018, 9:39 AM
Imarlier removed Imarlier as the assignee of this task.May 9 2018, 4:20 PM

@Imarlier To add notes from 2018 offsite discussion.