
Create a tracking category for pages which are slow to parse
Closed, DeclinedPublic

Description

If a page takes longer than N seconds to parse, add it to a hidden tracking category ('Slow pages' or whatever). Care must be taken not to inundate the category with entries in case of a partial outage causing all parse operations to be slow.
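The description's two requirements (a parse-time threshold plus a guard against flooding during an outage) could be sketched as follows. This is a minimal illustration, not MediaWiki code: the names `SlowPageTracker`, `OUTAGE_GUARD_LIMIT`, and the in-memory set standing in for the hidden tracking category are all hypothetical.

```python
import time

SLOW_PARSE_THRESHOLD = 5.0   # hypothetical "N seconds" from the description
OUTAGE_GUARD_WINDOW = 60     # seconds per rate-limit window
OUTAGE_GUARD_LIMIT = 100     # max additions per window before assuming an outage

class SlowPageTracker:
    """Tags pages whose parse time exceeds a threshold, with a simple
    circuit breaker so a site-wide slowdown does not flood the category."""

    def __init__(self):
        self.window_start = time.monotonic()
        self.additions_in_window = 0
        self.slow_pages = set()  # stands in for the hidden tracking category

    def record_parse(self, title, duration):
        """Return True if the page was added to the slow-pages category."""
        if duration < SLOW_PARSE_THRESHOLD:
            return False
        now = time.monotonic()
        if now - self.window_start > OUTAGE_GUARD_WINDOW:
            # start a fresh rate-limit window
            self.window_start = now
            self.additions_in_window = 0
        if self.additions_in_window >= OUTAGE_GUARD_LIMIT:
            # too many slow parses at once: likely a partial outage, skip tagging
            return False
        self.additions_in_window += 1
        self.slow_pages.add(title)
        return True
```

A real implementation would persist the counter in a shared cache rather than per-process memory, since parses are spread across many app servers.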

Event Timeline

ori created this task. Aug 29 2015, 12:21 AM
ori assigned this task to aaron.
ori raised the priority of this task from to Needs Triage.
ori updated the task description.
ori added a project: MediaWiki-Parser.
ori added a subscriber: ori.
Restricted Application added a subscriber: Aklapper. Aug 29 2015, 12:21 AM
ori set Security to None.

We may also want to redefine the current scope of "slow-parse".

Right now, if I understand correctly, slow-parse is only used when an article is parsed on-demand in a GET request (PoolWorkArticleView; presumably triggered when parser cache expired, or when logged-in users view articles with a user language other than the content language, or when the N+1th user views an article while the latest edit is still being parsed/saved).

However, I imagine that in the majority of cases, articles are parsed in the POST request of an edit, and no matter how slowly they parse, they never show up in slow-parse. We may want to add slow-parse instrumentation to that general path as well.
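The instrumentation proposed above (and the 'trigger' field later added in Gerrit change 248400) could be sketched as a timing wrapper that tags each slow-parse log entry with the code path that caused the parse. This is a hedged illustration; `timed_parse`, the threshold parameter, and the JSON log shape are assumptions, not MediaWiki's actual logging API.

```python
import json
import time
from contextlib import contextmanager

SLOW_PARSE_THRESHOLD = 5.0  # hypothetical threshold, in seconds

@contextmanager
def timed_parse(title, trigger, log=print, threshold=SLOW_PARSE_THRESHOLD):
    """Times a parse and emits a slow-parse log entry tagged with the
    code path: e.g. 'view' for on-demand GET parses, 'edit' for parses
    performed while saving an edit."""
    start = time.monotonic()
    try:
        yield
    finally:
        duration = time.monotonic() - start
        if duration >= threshold:
            log(json.dumps({
                "channel": "slow-parse",
                "title": title,
                "trigger": trigger,
                "seconds": round(duration, 3),
            }))
```

Usage would look like `with timed_parse("Main_Page", "edit"): do_parse(...)`, placed around both the view and the edit parse paths so slow saves are no longer invisible.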

Krinkle triaged this task as Medium priority. Sep 1 2015, 8:19 PM

Change 248400 had a related patch set uploaded (by Krinkle):
poolcounter: Add 'trigger' field to the slow-parse log

https://gerrit.wikimedia.org/r/248400

Change 248400 merged by jenkins-bot:
poolcounter: Add 'trigger' field to the slow-parse log

https://gerrit.wikimedia.org/r/248400

aaron added a comment. May 4 2016, 6:10 PM

This could be done via addTrackingCategory(), with memcached tricks to avoid mass addition, along with a check for whether a page is already in the category to avoid mass removal.
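The approach in this comment could be sketched roughly as below. The `CategoryUpdater` class, the per-minute counter key, and the plain dict standing in for memcached are all hypothetical; MediaWiki's actual `addTrackingCategory()` runs inside the parser, not as a separate object like this.

```python
import time

class CategoryUpdater:
    """Sketch of the suggested guards: rate-limit category additions via
    a shared cache counter (the 'memcached tricks'), and never drop a page
    that is already in the category, to avoid mass removal churn."""

    def __init__(self, cache, limit_per_minute=50):
        self.cache = cache  # dict stands in for memcached get/set
        self.limit = limit_per_minute
        self.category = set()  # stands in for the tracking category

    def _rate_limited(self):
        # one counter per minute-long window, shared across requests
        key = "slow-page-additions:%d" % int(time.time() // 60)
        count = self.cache.get(key, 0)
        if count >= self.limit:
            return True
        self.cache[key] = count + 1
        return False

    def on_parse(self, title, slow):
        if slow:
            # pages already in the category bypass the rate limit
            if title in self.category or not self._rate_limited():
                self.category.add(title)
        # if the parse was fast, deliberately do NOT remove the page:
        # a real implementation might require several consecutive fast
        # parses before removal, to avoid mass removal after a blip
```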

I'm not sure what the end goal is, or whether there is enough of a use case to justify this.

One related thing that might be more useful would be tracking tiers of render slowness (5-10 sec, 10-20 sec, 20+ sec) in redis or something, and warning (or blocking?) when a page would move into the next tier on edit.
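The tiering idea above could be sketched like this. The tier boundaries come from the comment; the function names and the warn/block policy are hypothetical.

```python
# Tier boundaries from the comment: 5-10 sec, 10-20 sec, 20+ sec.
TIERS = [(5, 10), (10, 20), (20, float("inf"))]

def slowness_tier(seconds):
    """Return the tier index a parse time falls into, or None if it is
    below the slowest tier's lower bound."""
    for i, (lo, hi) in enumerate(TIERS):
        if lo <= seconds < hi:
            return i
    return None

def check_edit(old_seconds, new_seconds):
    """Warn when an edit would push a page into a slower tier than before.
    A stricter policy might return 'block' for the top tier instead."""
    old_tier = slowness_tier(old_seconds)
    new_tier = slowness_tier(new_seconds)
    if new_tier is not None and (old_tier is None or new_tier > old_tier):
        return "warn"
    return "ok"
```

Storing the previous tier per page (e.g. in redis, as the comment suggests) is what would make the crossing detectable at edit time.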

aaron removed aaron as the assignee of this task. May 4 2016, 6:11 PM
aaron added a subscriber: aaron.
Gilles closed this task as Declined. Dec 6 2016, 3:36 PM
Gilles added a subscriber: Gilles.

Probably not that useful for users; identifying and pinpointing slow templates might be more worthwhile.

Restricted Application removed a subscriber: Liuxinyu970226. Dec 6 2016, 3:36 PM