
Does MediaWiki use more memory for something than it did before?
Open, Low, Public

Assigned To
None
Authored By
edwardspec
Mar 8 2013, 4:10 PM
Referenced Files
F10681: bug45900-2-mediawiki.log
Nov 22 2014, 1:30 AM
F10680: nginx_error.log
Nov 22 2014, 1:30 AM
F10679: bug45900-mediawiki.log
Nov 22 2014, 1:30 AM
F10678: memory_consuming_article.xml
Nov 22 2014, 1:30 AM

Description

Hello,
I've recently upgraded from MediaWiki 1.19 to MediaWiki 1.20.3.

I'm getting PHP errors "Allowed memory size of ... bytes exhausted" on some pages with many templates (in includes/parser/Preprocessor_DOM.php on line 1029).

I understand that this can be normal; however, with MediaWiki 1.19 my memory_limit was set to 64M and everything worked fine. I've increased the value to 128M, but some pages still fail to render (the same pages that rendered correctly before the upgrade with only 64M).
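
Raising the limit is only a workaround, but for reference this is how I did it in LocalSettings.php (as far as I know, MediaWiki raises PHP's memory_limit to at least this value for web requests):

# LocalSettings.php
$wgMemoryLimit = "128M";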

Maybe there were some changes to the parser or preprocessor that could account for this? A memory leak somewhere?

The pages in question are about 200K of wiki code and produce 1-2 MB of HTML; however, they contain some heavy templates with more than 10 parameters each (multiple invocations of en.wikipedia.org/wiki/Template:Chess_diagram).

I have a feeling that every template invocation is being kept in memory, even when this is not really needed. If that is the issue, it should be optimized.
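
One thing I intend to try (just a guess on my part, not a confirmed fix): MediaWiki also ships an alternative preprocessor, Preprocessor_Hash, which does not build the DOM representation used by Preprocessor_DOM. It can be forced in LocalSettings.php:

# LocalSettings.php
$wgParserConf['preprocessorClass'] = 'Preprocessor_Hash';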


Version: 1.20.x
Severity: normal

Details

Reference
bz45900

Event Timeline

bzimport raised the priority of this task to Low. Nov 22 2014, 1:30 AM
bzimport set Reference to bz45900.
bzimport added a subscriber: Unknown Object (MLST).

Please provide more info about the exact pages and the setup (database version, etc.). Also see http://www.mediawiki.org/wiki/Manual:How_to_debug
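
For example, the debug log can be enabled in LocalSettings.php (the path below is only a placeholder; use any location the web server can write to):

$wgDebugLogFile = "/var/log/mediawiki/debug.log";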

Created attachment 11929
An article that can't be rendered with 128M memory_limit (XML, for Special:Import)

Attached:

Created attachment 11930
The MediaWiki debug log

Attached:

Created attachment 11931
Nginx error log (with PHP message "Allowed memory size of ... bytes exhausted")

Attached:

The database is MySQL 5.5.

$wgMainCacheType = CACHE_ACCEL;   # object cache in the PHP accelerator (APC/XCache)
$wgParserCacheType = CACHE_DBA;   # DBA parser cache, db4 handler

Note the unusual number of operations on images in mediawiki.log.

The template includes only two images, and the article includes this template roughly 200 times. It looks like MediaWiki does something with the images 400 times (instead of 2 times).
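
A quick way to count this (a standalone PHP script, not MediaWiki code; the image title below is a placeholder, substitute one of the two images the template actually uses):

<?php
// Count how often one image title appears in the attached debug log.
$log = file_get_contents( 'bug45900-mediawiki.log' );
$image = 'Chess_board_blank.png'; // placeholder title, not necessarily the real one
echo substr_count( $log, $image ) . " mentions of $image\n";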

Created attachment 11932
The MediaWiki debug log for another page (which parsed successfully but took almost 64M)

Note the amount of memory used in Parser::braceSubstitution and PPFrame_DOM::expand.
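
To see how close a request gets to the limit, I also log the peak memory per request from LocalSettings.php (my own addition, not a MediaWiki feature; the log path is specific to my setup):

// Append the peak memory of every request to a separate log file.
register_shutdown_function( function () {
    $peak = memory_get_peak_usage( true ) / ( 1024 * 1024 );
    $uri = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : '(cli)';
    file_put_contents( '/var/log/mediawiki/peak-memory.log',
        sprintf( "%s %s peak=%.1fM\n", date( 'c' ), $uri, $peak ),
        FILE_APPEND );
} );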

Attached:

(a strategic suggestion)

Even if we assume that the memory usage is what it should be:

why should one have to increase the PHP memory_limit just because it is not enough to parse 10-20 pages on the wiki, while all other pages require much less memory?

MediaWiki should predict cases of high memory usage and handle them (by creating a temporary file, for example) instead of letting PHP crash.
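
To illustrate the kind of guard I mean, something roughly like the sketch below could let the preprocessor stop expanding templates gracefully instead of hitting a PHP fatal (purely illustrative; the function name and safety margin are made up, and this is not existing MediaWiki code):

// Returns false when less than $marginBytes of the memory_limit remains.
function hasEnoughMemoryForExpansion( $marginBytes = 8388608 ) {
    $limit = trim( ini_get( 'memory_limit' ) );
    if ( $limit === '' || $limit === '-1' ) {
        return true; // no limit configured
    }
    // Convert shorthand like "128M" into bytes.
    $bytes = (float)$limit;
    switch ( strtoupper( substr( $limit, -1 ) ) ) {
        case 'G': $bytes *= 1024; // fall through
        case 'M': $bytes *= 1024; // fall through
        case 'K': $bytes *= 1024;
    }
    return ( $bytes - memory_get_usage( true ) ) > $marginBytes;
}
// The preprocessor could check this before each template expansion and, when it
// returns false, abort with a normal page error rather than exhausting memory.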