
Yearly site performance reports (blog post)
Closed, Resolved · Public


We should be able to publicly show performance improvements and regressions in a good format to the rest of the world. It's a good way for us to share how we are doing and what we actually do. The reports need to contain detailed information, but should not be too technical.

Check out Etsy's great performance reports for inspiration.

High-level plan:

  • Define what to include
  • Decide the format
  • First release date: end of Q3


Event Timeline

Peter raised the priority of this task from to Medium.
Peter updated the task description.
Peter added a project: Performance-Team.
Peter added a subscriber: Peter.

The Etsy reports are really good; I think following their pattern with a few changes will be a great start. What about something like this? The long-term goal should be to report the median and 95th percentile.

  • Synthetic Front-End Performance - reporting SpeedIndex per browser, for anonymous/logged-in users, and for the baseline page. Should we also use other metrics so we are not forced to use WebPageTest?
  • Real User Front-end Performance - TBD
  • Server Side Performance - TBD
  • Extras: At first we report save performance and then add extra stats that will come out of T110121
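The median and 95th-percentile reporting mentioned above could be computed along these lines. This is a hypothetical sketch: the `percentile` helper and the sample timings are illustrative, not the team's actual pipeline or real measurements.

```python
# Hypothetical sketch: computing the median and 95th percentile of
# page-load timings, as proposed for the report. Sample data is
# illustrative only.

def percentile(samples, p):
    """Return the p-th percentile (0-100) of a list of numbers,
    using linear interpolation between closest ranks."""
    xs = sorted(samples)
    if not xs:
        raise ValueError("no samples")
    k = (len(xs) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Illustrative SpeedIndex-like samples, in milliseconds
timings = [1200, 1350, 1100, 2400, 1800, 1250, 3100, 1400, 1500, 1600]

median = percentile(timings, 50)
p95 = percentile(timings, 95)
print(f"median={median:.0f}ms p95={p95:.0f}ms")  # median=1450ms p95=2785ms
```

Reporting both numbers matters because the median shows the typical experience while the 95th percentile surfaces the slow tail that averages hide.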
Gilles renamed this task from Quarterly site performance reports to Yearly site performance reports. Dec 7 2016, 5:34 PM
Gilles raised the priority of this task from Medium to High.
Gilles set Security to None.
Krinkle renamed this task from Yearly site performance reports to Quarterly performance reports (blog post). Mar 23 2017, 8:57 PM
Krinkle moved this task from Backlog: Maintenance to Inbox on the Performance-Team board.

Change 344504 had a related patch set uploaded (by Krinkle):
[operations/mediawiki-config@master] StartProfiler: Disable production sampling for xhgui

Krinkle renamed this task from Quarterly performance reports (blog post) to Yearly site performance reports (blog post). Jun 14 2017, 7:31 PM
Krinkle lowered the priority of this task from High to Medium. Jul 6 2017, 5:35 AM

We mentioned this in the meeting last week. We're still interested in doing these, but we do need better stability in our metrics to be able to do this in a meaningful way, and to be strict about investigating regressions and improvements as they happen.

We've been doing this for about 2-3 months, largely thanks to our new alerts. Reprioritising this for now, but we should revisit it 1-2 quarters from now.

Imarlier raised the priority of this task from Medium to High. Jun 21 2018, 9:11 AM

Ref: the post that Gilles wrote last year.

It doesn't make sense to work on this until we've completed the Performance Perception survey and analysis -- that work will help us to identify a small number of metrics that we should be focusing on (and reporting on) rather than trying to address everything at once, or naively assuming that certain metrics are important without knowing for sure.

Done, per

For now, it covers individual projects and metric impacts, as opposed to regular reporting of key metrics. This is partly because T187684 is not yet resolved, but also because we lost the Graphite data for 2015/2016. I'm hoping that after T187684 is resolved, starting with the 2017 edition, we'll do year-range analysis of metrics as well.

Krinkle moved this task from Blocked (old) to Doing (old) on the Performance-Team board.