|Open||None||T255502 Goal: Save Timing median back under 1 second|
|Resolved||Krinkle||T277788 Save Timing improvements (2021-2022)|
|Resolved||Ladsgroup||T292300 Eliminate unnecessary duplicate parses (2021-2022)|
|Resolved||Ladsgroup||T288639 SpamBlacklistHooks::onEditFilterMergedContent causes every edit to be rendered twice|
|Resolved||Pchelolo||T292302 CommonsMetadata extension causes every page on commons to be always parsed twice|
|Open||matej_suchanek||T264104 Verify AbuseFilter code that claims to share and re-use ParserOutput from core|
|Resolved||matmarex||T301309 Refreshlinks job is parsing pages twice|
|Resolved||Ladsgroup||T301310 CommonsMetadata extension is triggering a duplicate parse in commons|
- Mentioned In
- T157670: Periodically run refreshLinks.php on production sites.
- T323068: Log spam: "MediaWiki\Parser\ParserObserver::notifyParse: Possibly redundant parse!" from DiscussionTools code
- Mentioned Here
- T277788: Save Timing improvements (2021-2022)
- T299124: ProofreadPage frontend makes a request to the page before and after in every page view
- T288707: Detect and monitor against multiple Parser invocation during edit requests
The ProofreadPage ones are actually false positives. The way it works is that ProofreadPage's content handler's fillParserOutput prepends some wikitext to the page, creates a wikitext content handler, and calls getParserOutput on that, which triggers the error.
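To illustrate why this nested call shows up as a false positive, here is a minimal sketch (in Python, not MediaWiki's actual PHP implementation) of a duplicate-parse detector in the spirit of ParserObserver::notifyParse: it flags any second parse of the same (title, revision) pair, so a content handler that intentionally re-invokes the parser on derived wikitext is reported as redundant. The class and function names below are illustrative, not MediaWiki APIs.

```python
class ParserObserver:
    """Records every parse and warns when the same (title, revision)
    pair is parsed more than once -- the pattern behind the
    'Possibly redundant parse!' log message."""

    def __init__(self):
        self._seen = set()
        self.warnings = []

    def notify_parse(self, title, revision_id):
        key = (title, revision_id)
        if key in self._seen:
            self.warnings.append(
                f"Possibly redundant parse! {title} rev {revision_id}"
            )
        self._seen.add(key)


def fill_parser_output(observer, title, revision_id, wikitext):
    """Hypothetical ProofreadPage-style handler: it parses the page,
    then prepends header wikitext and parses the combined text again
    via a nested call -- an intentional second parse."""
    observer.notify_parse(title, revision_id)          # outer parse
    derived = "<header wikitext>\n" + wikitext
    observer.notify_parse(title, revision_id)          # nested parse -> flagged
    return derived


obs = ParserObserver()
fill_parser_output(obs, "Page:Example.djvu/1", 12345, "page body")
print(obs.warnings)  # the intentional nested parse is still reported
```

The observer has no way to distinguish a wasteful duplicate parse from a deliberate nested one, which is exactly why these ProofreadPage reports are false positives.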
After a lot of clean-up, you still see many duplicate parses, but a closer look (which I did today) shows that most of them are either harmless or unimportant. Take for example the jobrunners, which account for 80% of all duplicate parses and 1K log entries per minute.
They come from three jobs:
- CirrusSearch job
- Only happens on Wikidata; while it is frequent, it doesn't produce HTML and doesn't call the term store, so it's cheap.
- Happens at a small rate, about 10 per minute, mostly on Commons.
- A rather smaller volume, 300 per minute, almost exclusively on Commons, probably because of MCR, so not a really big issue.
We can chase the long tail, but it would be hard work for too little gain.