
Increase the "Post-expand include size" limit to 2.5 MB
Open, Needs Triage, Public

Description

Hello

Some wikis have more and more pages in "Category:Pages where template include size is exceeded".
Templates are there to help organize pages and articles, but the "Post-expand include size" limit restricts their use.
I wish the limit could be increased from 2 MB (which appears to be the current value) to 2.5 MB.

Thanks

Event Timeline

Hi, well, what is the methodology behind picking 2.5 MB? Why not 2500 MB or something else? I think the proper solution to slow software is to make it faster, not just to raise limits, so this task might get declined. See also T181907.

2.5 MB is just a 25 % increase.
Over time, pages will get bigger and bigger, so the value should be raised gradually. This is what happened on Commons with the maximum upload size: it was increased step by step. The goal is to have pages that can be built normally without being resource-greedy, while remaining usable.

The category is already populated on:
Commons - 112 pages
mediawiki - 1 page
metawiki - 840 pages
wikidata - 174 pages

And on Wikipedias:
ar - 92 pages
de - 60 pages
da - 36 pages
en - 493 pages
fr - 81 pages
ja - 333 pages
ru - 53 pages

The trend will not go down, so this will pose more and more of a problem.

> 2.5 MB is just a 25 % increase.

Errm, no. Please see T189108#4031827 again. :)

I don't understand the link with making the software "faster" or "slower".

This would require a performance review first (not sure which project tag that implies).

This amounts to a request to increase the default $wgMaxArticleSize of 2048 (kilobytes) on production wikis. It was authored Feb 21 2006, 7:55 PM in rSVN13070 by timstarling (now @tstarling). Much has changed in user environments in the 16 years since then.
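For concreteness, here is a minimal Python sketch of the arithmetic behind the request, assuming (per the comment above) that the post-expand include size limit corresponds to $wgMaxArticleSize expressed in kilobytes:

```
# Back-of-envelope arithmetic for the request. Assumption: the
# post-expand include size limit is $wgMaxArticleSize, expressed in KB.
current_kb = 2048   # default $wgMaxArticleSize since 2006
proposed_kb = 2560  # 2.5 MB, as requested in the task description
increase = (proposed_kb - current_kb) / current_kb
print(f"{current_kb} KB -> {proposed_kb} KB (+{increase:.0%})")
# 2048 KB -> 2560 KB (+25%)
```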

> This amounts to a request to increase the default $wgMaxArticleSize of 2048 (kilobytes) on production wikis.

Please see T325836#8495451 - thanks.

What should be done is not increasing the post-expand include size to some random size, but moving some limits to be symbol-based and not byte-based. Currently non-ASCII wikis have much lower limits because, naturally, texts in their languages have to use Unicode (Кот is 6 bytes and Cat is 3 bytes, for example).
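A minimal Python sketch of that byte-vs-character point, using the same Кот/Cat example:

```
# Byte-based limits penalize non-ASCII text: in UTF-8, each Cyrillic
# letter takes two bytes, while each ASCII letter takes one.
for word in ("Cat", "Кот"):
    print(word, "-", len(word), "characters,", len(word.encode("utf-8")), "bytes")
# Cat - 3 characters, 3 bytes
# Кот - 3 characters, 6 bytes
```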

This is currently being used as a reason to avoid certain types of edits on larger articles on English Wikipedia, particularly the Donald Trump article, which was brought up in this discussion: https://en.wikipedia.org/wiki/Wikipedia_talk:Citing_sources#Talk:Donald_Trump_and_using_WP:LOCALCON_to_disallow_citation_archives

I'd actually recommend just doubling the current limit, as render times have become much more manageable. From my brief research on this topic, the limit was meant to stop denial-of-service attacks (larger, more complex pages being deliberately used to overwhelm MediaWiki). But the real time used for page rendering has been reduced significantly, so this limit should go up to allow more complex pages to render correctly while still staying within the original real-time limits from when this was put into place.

Since the creation of the ticket, according to Moore's law, the available resources on the client and server side should have increased by a factor sufficient to double the limits.
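A back-of-envelope illustration of that estimate, assuming a two-year doubling period (purely illustrative, not a measurement of actual Wikimedia capacity):

```
# Illustrative only: Moore's-law growth (doubling roughly every two
# years) over the ~16 years between the 2006 default and this comment.
# Not a measurement of actual Wikimedia client or server capacity.
years = 16
doubling_period_years = 2
growth = 2 ** (years / doubling_period_years)
print(f"~{growth:.0f}x theoretical growth over {years} years")  # ~256x
```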

> Since the creation of the ticket, according to Moore's law, the available resources on the client and server side should have increased by a factor sufficient to double the limits.

It's actually an even older setting than that, according to the comment quoted below, so I'd posit you could do even more than double; but unless someone thinks 4096 is too low, I think doubling it would give editors a lot more flexibility:

> This amounts to a request to increase the default $wgMaxArticleSize of 2048 (kilobytes) on production wikis. It was authored Feb 21 2006, 7:55 PM in rSVN13070 by timstarling (now @tstarling). Much has changed in user environments in the 16 years since then.

It's been over 18 years now.