
Increase $wgExpensiveParserFunctionLimit on nowiki
Closed, DeclinedPublic


At nowiki we hit the limit for "Expensive parser function count", reaching 504-506-508 in various configurations but still missing about 1/8 of the total calls. For the moment we have very few pages with too many expensive parser functions, and this is not a page with heavy traffic either. It would nevertheless be nice to support the users who are utilizing data from Wikidata. The actual page is Paris–Nice 2017.

If possible we would like to increase the limit from 500 to 750, that is a 50% increase, to facilitate rendering of slightly heavier pages.

Local discussion is at w:no:Wikipedia:Torget#For mye Wikidata?

Event Timeline

Reedy added a subscriber: Reedy.

I suggest we increase it everywhere, not just in one place.

Wants sign-off from SRE and Performance-Team before doing so.

Yes, I believe that would be a better solution. ;)

Saving a null revision on this page took a good 7-10 seconds to get back to the browser. That's pretty high already. I won't make a decision right here, but my first guess would be that our limits are already very reasonable and that we should instead focus our energy on making it less expensive to get the required information.

I have no doubt that the use cases from this page are entirely valid and reasonable. But I'd recommend looking into some of these options first:

  1. Are the templates performing multiple queries where a single query would work? E.g. instead of querying A from X and B from A, maybe it can query A+B from X in one query.
  2. Are the templates and Lua modules performing any unnecessary queries or computations? Maybe the Lua module supports a lot of information, but based on the template parameters only some of the pieces are used. Make sure the current parameters are taken into consideration in the loop where many queries are made.
  3. Are Lua and/or Wikibase caching the results of the same query within the same parsing invocation? Templates don't have shared context. Each of the 10-20 main templates on the page starts with the same queries but extracts different pieces from them. The duplicates should not add to the expensive function count.
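The deduplication in point 3 could be done with a small memoization table inside the module itself. A minimal sketch, assuming the Scribunto environment and that the module fetches entities with mw.wikibase.getEntity (the helper names here are illustrative, not from Module:Cycling race). Within a single #invoke this avoids repeating the expensive call for the same entity; whether the table survives across separate #invoke calls depends on how Scribunto isolates invocation environments, which is debated later in this thread.

```lua
-- Sketch: wrap an expensive lookup function in a per-parse cache, so
-- repeated requests for the same entity id reuse the first result.
local function makeMemoizedFetch(fetch)
    local cache = {}
    return function(id)
        if cache[id] == nil then
            -- Only the first lookup per id hits the expensive function;
            -- store `false` so failed lookups are not retried either.
            cache[id] = fetch(id) or false
        end
        return cache[id] or nil
    end
end

-- Hypothetical usage inside the module:
-- local getEntity = makeMemoizedFetch(mw.wikibase.getEntity)
-- local entity = getEntity('Q29000299')  -- Paris–Nice 2017
```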
NewPP limit report
Parsed by mw1180
Cached time: 20170316211723
Cache expiry: 2592000
Dynamic content: false
CPU time usage: 7.048 seconds
Real time usage: 7.265 seconds
Preprocessor visited node count: 2760/1000000
Preprocessor generated node count: 0/1500000
Post‐expand include size: 204305/2097152 bytes
Template argument size: 28974/2097152 bytes
Highest expansion depth: 15/40
Expensive parser function count: 504/500
Lua time usage: 6.657/10.000 seconds
Lua memory usage: 13.98 MB/50 MB
Transclusion expansion time report (%,ms,calls,template)
100.00% 6940.593      1 -total
 36.61% 2540.989      8 Mal:Cycling_race/stageclassification
 35.68% 2476.197      8 Mal:Cycling_race/generalclassification
  5.97%  414.648      1 Mal:Cycling_race/infobox
  3.56%  247.233      1 Mal:Autoritetsdata
  1.48%  102.924      1 Mal:Sportslenker
  0.82%   56.744      1 Mal:Offisielt_nettsted
  0.80%   55.366      2 Mal:Navboks
  0.69%   47.927      2 Mal:Navboks/kjerne
  0.68%   47.203      1 Mal:Paris–Nice
<!-- Saved in parser cache with key nowiki:pcache:idhash:1486824-0!*!0!!en!4!* and timestamp 20170316211716 and revision id 17233009

The module is a Wikidata project, and performance issues should be directed there. As I said on IRC, I don't think the code is especially good, but I think it is better to support the users in getting this up and running.

Caching in Lua is a problem, as caching across invocations (require) is blocked.

Btw, the code at Wikidata can be found at d:Module:Cycling race.

It may be based on that, but the relevant wikis all have their own copy of it.

Looks like updating to the latest version from Wikidata might solve the problem. Some of the recent changes to that Lua module mention "Less expensive function calls".

Yes, before it is increased, please see if other methods, like getting the latest version of the module from Wikidata, improve the situation.

fgiunchedi triaged this task as Medium priority.Apr 12 2017, 8:00 AM

I'm not going to do further follow-up on this, and it is only a single place where they are stuck on this limit.
Will close as declined.