
MWException when viewing or comparing certain pages with Preprocessor_DOM (PHP7 beta feature)
Closed, Resolved · Public · PRODUCTION ERROR

Description

When viewing this diff https://fr.wikipedia.org/w/index.php?title=Saint-Girons_(Ariège)&diff=156875793&oldid=156712123

some users get the following error (my result); some users don't.

Error

Request ID: XG3fRwpAICEAAJEGYV8AAACI

message
[XG3fRwpAICEAAJEGYV8AAACI] 2019-02-20 23:14:17: Fatal error of type "MWException"

Impact

Affected users can't display the diff.

Notes

Event Timeline


[{exception_id}] {exception_url} MWException from line 162 of /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Preprocessor_DOM.php: Preprocessor_DOM::preprocessToObj: generated node count limit exceeded

Just in time to save the trace (it would have expired from Logstash tomorrow).

message
MWException: Preprocessor_DOM::preprocessToObj: generated node count limit exceeded
trace
#0 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(2975): Preprocessor_DOM->preprocessToObj(string, integer)
#1 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(3581): Parser->preprocessToDom(string, integer)
#2 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(3334): Parser->getTemplateDom(Title)
#3 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Preprocessor_DOM.php(1279): Parser->braceSubstitution(array, PPTemplateFrame_DOM)
#4 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(3396): PPFrame_DOM->expand(DOMElement)
#5 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Preprocessor_DOM.php(1279): Parser->braceSubstitution(array, PPFrame_DOM)
#6 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(3036): PPFrame_DOM->expand(DOMElement, integer)
#7 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(1354): Parser->replaceVariables(string)
#8 /srv/mediawiki/php-1.33.0-wmf.17/includes/parser/Parser.php(482): Parser->internalParse(string)
#9 /srv/mediawiki/php-1.33.0-wmf.17/includes/content/WikitextContent.php(369): Parser->parse(string, Title, ParserOptions, boolean, boolean, integer)
#10 /srv/mediawiki/php-1.33.0-wmf.17/includes/content/AbstractContent.php(555): WikitextContent->fillParserOutput(Title, integer, ParserOptions, boolean, ParserOutput)
#11 /srv/mediawiki/php-1.33.0-wmf.17/includes/Revision/RenderedRevision.php(265): AbstractContent->getParserOutput(Title, integer, ParserOptions, boolean)
#12 /srv/mediawiki/php-1.33.0-wmf.17/includes/Revision/RenderedRevision.php(234): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(WikitextContent, boolean)
#13 /srv/mediawiki/php-1.33.0-wmf.17/includes/Revision/RevisionRenderer.php(193): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string)
#14 /srv/mediawiki/php-1.33.0-wmf.17/includes/Revision/RevisionRenderer.php(142): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, array)
#15 [internal function]: MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array)
#16 /srv/mediawiki/php-1.33.0-wmf.17/includes/Revision/RenderedRevision.php(197): call_user_func(Closure, MediaWiki\Revision\RenderedRevision, array)
#17 /srv/mediawiki/php-1.33.0-wmf.17/includes/poolcounter/PoolWorkArticleView.php(194): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()
#18 /srv/mediawiki/php-1.33.0-wmf.17/includes/poolcounter/PoolCounterWork.php(123): PoolWorkArticleView->doWork()
#19 /srv/mediawiki/php-1.33.0-wmf.17/includes/page/WikiPage.php(1243): PoolCounterWork->execute()
#20 /srv/mediawiki/php-1.33.0-wmf.17/includes/diff/DifferenceEngine.php(922): WikiPage->getParserOutput(ParserOptions, integer)
#21 /srv/mediawiki/php-1.33.0-wmf.17/includes/diff/DifferenceEngine.php(883): DifferenceEngine->getParserOutput(WikiPage, Revision)
#22 /srv/mediawiki/php-1.33.0-wmf.17/includes/diff/DifferenceEngine.php(711): DifferenceEngine->renderNewRevision()
#23 /srv/mediawiki/php-1.33.0-wmf.17/includes/page/Article.php(932): DifferenceEngine->showDiffPage(boolean)
#24 /srv/mediawiki/php-1.33.0-wmf.17/includes/page/Article.php(616): Article->showDiffPage()
#25 /srv/mediawiki/php-1.33.0-wmf.17/includes/actions/ViewAction.php(68): Article->view()
#26 /srv/mediawiki/php-1.33.0-wmf.17/includes/MediaWiki.php(501): ViewAction->show()
#27 /srv/mediawiki/php-1.33.0-wmf.17/includes/MediaWiki.php(294): MediaWiki->performAction(Article, Title)
#28 /srv/mediawiki/php-1.33.0-wmf.17/includes/MediaWiki.php(867): MediaWiki->performRequest()
#29 /srv/mediawiki/php-1.33.0-wmf.17/includes/MediaWiki.php(517): MediaWiki->main()
#30 /srv/mediawiki/php-1.33.0-wmf.17/index.php(42): MediaWiki->run()
#31 /srv/mediawiki/w/index.php(3): require(string)
#32 {main}
Krinkle renamed this task from error while looking for diff depending on user to MWException when some users view or compare a certain articles on fr.wikipedia.org. (Mar 19 2019, 7:36 PM)
Krinkle updated the task description. (Show Details)

From the trace above we can see that the problem isn't with the diff itself but with the page rendered below the diff. As such, the affected users are unable to view either the article or its diffs.

Recorded 119 times in Logstash over the past 7 days. Seems to affect only fr.wikipedia.org.

Here's a sample from a page view:

  • Request ID: XJCxEApAME0AAETkbVYAAACV
  • Request URL: HTTP GET /wiki/Libourne on frwiki
trace
#0 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(2975): Preprocessor_DOM->preprocessToObj(string, integer)
#1 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(3581): Parser->preprocessToDom(string, integer)
#2 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(3334): Parser->getTemplateDom(Title)
#3 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Preprocessor_DOM.php(1279): Parser->braceSubstitution(array, PPTemplateFrame_DOM)
#4 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(3396): PPFrame_DOM->expand(DOMElement)
#5 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Preprocessor_DOM.php(1279): Parser->braceSubstitution(array, PPFrame_DOM)
#6 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(3036): PPFrame_DOM->expand(DOMElement, integer)
#7 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(1354): Parser->replaceVariables(string)
#8 /srv/mediawiki/php-1.33.0-wmf.21/includes/parser/Parser.php(482): Parser->internalParse(string)
#9 /srv/mediawiki/php-1.33.0-wmf.21/includes/content/WikitextContent.php(369): Parser->parse(string, Title, ParserOptions, boolean, boolean, integer)
#10 /srv/mediawiki/php-1.33.0-wmf.21/includes/content/AbstractContent.php(555): WikitextContent->fillParserOutput(Title, integer, ParserOptions, boolean, ParserOutput)
#11 /srv/mediawiki/php-1.33.0-wmf.21/includes/Revision/RenderedRevision.php(265): AbstractContent->getParserOutput(Title, integer, ParserOptions, boolean)
#12 /srv/mediawiki/php-1.33.0-wmf.21/includes/Revision/RenderedRevision.php(234): MediaWiki\Revision\RenderedRevision->getSlotParserOutputUncached(WikitextContent, boolean)
#13 /srv/mediawiki/php-1.33.0-wmf.21/includes/Revision/RevisionRenderer.php(193): MediaWiki\Revision\RenderedRevision->getSlotParserOutput(string)
#14 /srv/mediawiki/php-1.33.0-wmf.21/includes/Revision/RevisionRenderer.php(142): MediaWiki\Revision\RevisionRenderer->combineSlotOutput(MediaWiki\Revision\RenderedRevision, array)
#15 [internal function]: MediaWiki\Revision\RevisionRenderer->MediaWiki\Revision\{closure}(MediaWiki\Revision\RenderedRevision, array)
#16 /srv/mediawiki/php-1.33.0-wmf.21/includes/Revision/RenderedRevision.php(197): call_user_func(Closure, MediaWiki\Revision\RenderedRevision, array)
#17 /srv/mediawiki/php-1.33.0-wmf.21/includes/poolcounter/PoolWorkArticleView.php(194): MediaWiki\Revision\RenderedRevision->getRevisionParserOutput()
#18 /srv/mediawiki/php-1.33.0-wmf.21/includes/poolcounter/PoolCounterWork.php(123): PoolWorkArticleView->doWork()
#19 /srv/mediawiki/php-1.33.0-wmf.21/includes/page/Article.php(773): PoolCounterWork->execute()
#20 /srv/mediawiki/php-1.33.0-wmf.21/includes/actions/ViewAction.php(68): Article->view()
#21 /srv/mediawiki/php-1.33.0-wmf.21/includes/MediaWiki.php(501): ViewAction->show()
#22 /srv/mediawiki/php-1.33.0-wmf.21/includes/MediaWiki.php(294): MediaWiki->performAction(Article, Title)
#23 /srv/mediawiki/php-1.33.0-wmf.21/includes/MediaWiki.php(867): MediaWiki->performRequest()
#24 /srv/mediawiki/php-1.33.0-wmf.21/includes/MediaWiki.php(517): MediaWiki->main()
#25 /srv/mediawiki/php-1.33.0-wmf.21/index.php(42): MediaWiki->run()
#26 /srv/mediawiki/w/index.php(3): require(string)
#27 {main}

I suspect there may be user preferences involved that cause the parser to exceed a complexity limit we don't allow wiki pages to cross. The limit itself is fine, but it seems problematic that we allow user A to save a page which user B then can't view. I wonder what user preference (or user right) could cause this?

@Krinkle, it's the PHP 7 beta feature that can be enabled in the preferences.

@Nirmos Thanks! I've confirmed this also when logged-out (by using PHP 7 via the WikimediaDebug browser extension), and purging the page (within PHP 7) to trigger a re-parse.

This rules out any aspects of the Beta Feature and the WikimediaEvents-php7 JS code, and further confirms that PHP 7 alone is the deciding factor.

Nirmos renamed this task from MWException when some users view or compare a certain articles on fr.wikipedia.org to MWException when viewing or comparing certain pages with PHP7 beta feature. (Mar 19 2019, 9:14 PM)

Interesting. Probably not actually related to PHP 7 but to Preprocessor_DOM -- I believe HHVM still runs Preprocessor_Hash. I've got an outstanding request to reduce the code duplication: T204945: Deprecate one of the Preprocessor implementations for 1.34.

That still doesn't answer the question as to why the two preprocessors are counting the nodes differently -- in theory the two implementations should have identical behavior. But it does answer the question of how user A (running on Preprocessor_Hash/HHVM) could save a page which user B (running Preprocessor_DOM/PHP 7) could not view.

cscott renamed this task from MWException when viewing or comparing certain pages with PHP7 beta feature to MWException when viewing or comparing certain pages with Preprocessor_DOM (PHP7 beta feature). (Mar 21 2019, 8:58 PM)
cscott added a subscriber: Parsing-Team--ARCHIVED.
cscott added a subscriber: tstarling.

Looking at the limit report of that page on HHVM shows me:

"ppgeneratednodes":{"value":0,"limit":1500000}

It seems HHVM fails to count the nodes and therefore never exceeds the limit:

$this->parser->mGeneratedPPNodeCount += substr_count( $xml, '<' );
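The counting quoted above is just a substring count over the serialized preprocessor XML: each '<' counts as one generated node, and crossing the limit raises the exception seen in the traces. A Python sketch of that logic (illustrative only, not MediaWiki code; the real limit is the 1,500,000 from the "ppgeneratednodes" report):

```python
# Python stand-in (not MediaWiki code) for Preprocessor_DOM's node accounting.
class NodeCountExceeded(Exception):
    pass

class PreprocessorSketch:
    def __init__(self, limit=1_500_000):
        self.limit = limit
        self.generated_node_count = 0

    def account_generated_xml(self, xml):
        # equivalent to: mGeneratedPPNodeCount += substr_count( $xml, '<' )
        self.generated_node_count += xml.count("<")
        if self.generated_node_count > self.limit:
            raise NodeCountExceeded(
                "preprocessToObj: generated node count limit exceeded"
            )

# A tiny limit makes the failure mode easy to see:
pp = PreprocessorSketch(limit=5)
pp.account_generated_xml("<root><template>")  # 2 nodes counted so far
try:
    pp.account_generated_xml("<a><b><c><d>")  # pushes the total past 5
except NodeCountExceeded as e:
    print(e)  # preprocessToObj: generated node count limit exceeded
```

On HHVM this counter apparently stayed at zero, which is why the limit never fired there.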

I have imported the page on my dev machine with a higher limit and it shows

"ppgeneratednodes":{"value":1883442,"limit":20000000}

But that's all from the templates; the Lua modules were not imported yet.

The problem is that the node count limit does not work on HHVM, and therefore such pages were never blocked.

I have no further ideas on this task, apart from two questions:

  1. Could the error be replaced by a clearer one that explains what actually happens on PHP 7?
  2. Apparently the node count limit is broken in MediaWiki's current setup. Maybe it is not that useful? Once the counting works correctly, it probably can't be removed entirely, but can it be increased so that it doesn't cause problems?

Node counting was added to Preprocessor_DOM in 2caa7829fcc6a0ab45f91c4346c0d5a9100ef4dc by @tstarling. It's not strictly needed in Preprocessor_Hash (as I understand it) because libdom allocates memory in a different way than used by Preprocessor_Hash, but perhaps it should be implemented for consistency -- so that we don't get pages which are only editable if you use Preprocessor_Hash not Preprocessor_DOM.

Alternatively we should use Preprocessor_Hash on both HHVM and PHP 7 and deprecate/remove Preprocessor_DOM (see T204945: Deprecate one of the Preprocessor implementations for 1.34).

@cscott So, does that mean T204945 is no longer blocked on PHP7 migration or the removal of HHVM? Is T219901 blocked on anything?

I was sort of waiting to see if @tstarling had anything to say about this, as (a) he wrote both Preprocessor_DOM and Preprocessor_Hash (AFAIK) and (b) he was the one objecting to the previous plan (deprecating _Hash in favor of _DOM).

But, no, I don't know of any blockers for either task. I was just waiting to write a patch to see if anyone would loudly object first.

Change 502563 had a related patch set uploaded (by C. Scott Ananian; owner: C. Scott Ananian):
[mediawiki/core@master] WIP DNP BAD IDEA: count PPNodes in Preprocessor_Hash

https://gerrit.wikimedia.org/r/502563

Change 502567 had a related patch set uploaded (by C. Scott Ananian; owner: C. Scott Ananian):
[operations/mediawiki-config@master] Default to Preprocessor_Hash on both PHP7 and HHVM

https://gerrit.wikimedia.org/r/502567

I'd recommend merge of 502567 ^. @Krinkle, if you agree C+1 and we can get this scheduled for SWAT?


I support the idea of using the same preprocessor on PHP 7 as we have already been using on HHVM.

However, I can't speak to the combination of PHP 7 and Preprocessor_Hash specifically, as I really don't know this code. I'd prefer @tstarling review that instead.

Given all the PHP memory/speed enhancements, some benchmarking is probably in order. Maybe BenchmarkParse could use a preprocess-only option. I'm curious how PHP7 compares to HHVM for large pages.

Change 502567 merged by jenkins-bot:
[operations/mediawiki-config@master] Default to Preprocessor_Hash on both PHP7 and HHVM

https://gerrit.wikimedia.org/r/502567

The good news is, the "generated node count limit exceeded" error we saw from PHP 7 is no longer seen in production since the above patch.

The bad news is, at least some of these pages (possibly all) still don't render properly. They instead now fail with a lower-level error about the memory limit being exceeded. This can be reproduced with the original example by using WikimediaDebug and mwdebug1002+php7.

PHP Fatal error: Allowed memory size of 692060160 bytes exhausted (tried to allocate 20480 bytes) in /srv/mediawiki/php-1.34.0-wmf.3/includes/parser/Preprocessor_Hash.php on line 123.

The same still passes on HHVM on mwdebug1002, which is interesting. Does this mean HHVM is using less memory for the same task than PHP 7? Or maybe it's measuring/enforcing it differently?

@Krinkle should we raise the memory limit slightly? I can do some tests, but apart from that what remains to be done?

This is a blocker for the deployment of php7 in production, AIUI.

Krinkle triaged this task as High priority. (Edited May 7 2019, 4:22 PM)

@Joe I don't know. I defer to Parsing Team and/or Tim Starling on this component. I don't know whether this is expected to consume more memory on PHP7 or what the cause/impact might be here.

The fundamental issue is that pages last saved on HHVM must be readable and editable on PHP 7, which currently isn't the case.

@cscott, my understanding is that now that the latest patch is in, Parsing needs to look at this again. Are you the right person to pick it up?

Okay, I'll chat with Scott and Tim about this and figure out who can tackle this.

Does this mean HHVM is using less memory for the same task than PHP 7? Or maybe it's measuring/enforcing it differently?

This would be my first guess, yes. Unfortunately, we'll need to work to understand exactly what's going on here before we have confidence that "increasing memory limit by X" will be sufficient to solve the problem.

Just did some tests in production. The aforementioned page will render fine if I raise the memory limit to 2 GB. It also takes 25 seconds to render vs the 5 seconds it takes with HHVM.

It's clear we're hitting some pathological path in the PHP 7 VM.

As for understanding what's going on, I'm trying to collect low-level data, but given we can't inspect what's going on in php7 via perf, it would probably be useful to check a profile of a request from the PHP POV.

From the original request's parsing report:

PHP 7.2

{"cputime":"24.524","walltime":"25.193","ppvisitednodes":{"value":50395,"limit":1000000},"ppgeneratednodes":{"value":0,"limit":1500000},"postexpandincludesize":{"value":280479,"limit":2097152},"templateargumentsize":{"value":55158,"limit":2097152},"expansiondepth":{"value":23,"limit":40},"expensivefunctioncount":{"value":9,"limit":500},"unstrip-depth":{"value":1,"limit":20},"unstrip-size":{"value":52357,"limit":5000000},"entityaccesscount":{"value":1,"limit":400},"timingprofile":["100.00% 24426.061      1 -total"," 91.03% 22234.437      1 Modèle:Tableau_rang_commune_de_France"," 14.17% 3461.455      8 Modèle:D-Nbcom"," 12.84% 3136.004      1 Modèle:Données_Rang2013"," 12.58% 3073.871      1 Modèle:Données_Rang2009"," 11.53% 2817.202      1 Modèle:Données_Rang2006","  9.49% 2318.452      1 Modèle:Données_Rang1999","  8.47% 2067.674      1 Modèle:Données_Rang1990","  7.89% 1927.185      1 Modèle:Données_Rang1975","  7.48% 1827.322      1 Modèle:Données_Rang1982"]},"scribunto":{"limitreport-timeusage":{"value":"0.496","limit":"10.000"},"limitreport-memusage":{"value":10224907,"limit":52428800},"limitreport-logs":"m²\t32\nkm²\t32\n"}

vs HHVM

{"limitreport":{"cputime":"4.404","walltime":"5.181","ppvisitednodes":{"value":50395,"limit":1000000},"ppgeneratednodes":{"value":0,"limit":1500000},"postexpandincludesize":{"value":280479,"limit":2097152},"templateargumentsize":{"value":55158,"limit":2097152},"expansiondepth":{"value":22,"limit":40},"expensivefunctioncount":{"value":9,"limit":500},"unstrip-depth":{"value":1,"limit":20},"unstrip-size":{"value":52357,"limit":5000000},"entityaccesscount":{"value":1,"limit":400},"timingprofile":["100.00% 4844.654      1 -total"," 66.05% 3200.061      1 Modèle:Tableau_rang_commune_de_France"," 11.16%  540.643      1 Modèle:Infobox_Commune_de_France"," 11.02%  533.866      1 Modèle:Infobox_Subdivision_administrative","  9.67%  468.373      1 Modèle:Données_Rang1968","  9.31%  451.182      1 Modèle:Données_Rang1975","  8.66%  419.316      1 Modèle:Données_Rang1982","  7.60%  368.230      1 Modèle:Données_Rang1990","  7.17%  347.524      1 Modèle:Population_de_France/section","  6.91%  334.755      1 Modèle:Données_Rang2013"]},"scribunto":{"limitreport-timeusage":{"value":"0.403","limit":"10.000"},"limitreport-memusage":{"value":10143875,"limit":52428800},"limitreport-logs":"m²\t32\nkm²\t32\n"},

It should be noted that even with the parser cache populated, PHP 7 takes ~25-30 seconds to render the page, while HHVM takes around 5. So I suspect this problem is not necessarily tied to reparsing what's in the parser cache.

More interesting: I raised the memory limit on mwdebug1002 and got profiling information from both hhvm and php7 and the main difference is the memory spent in Preprocessor_Hash::preprocessToObj:

  • 1 GB in php7
  • 220 MB in HHVM

The timings are also very different, but not as clear in pointing out where the problem might be.

The same behaviour can be seen on the other page @Krinkle mentioned, https://fr.wikipedia.org/wiki/Libourne: see the profiling data for HHVM vs PHP 7, which makes me think the bad behaviour is caused by a template both pages include. The fact that the problem is limited to frwiki also supports this hypothesis.

I would suggest that the problem probably lies in https://fr.wikipedia.org/wiki/Mod%C3%A8le:Tableau_rang_commune_de_France, which seems to do lookups on large tables contained in other objects (I'm not proficient with wikitext, sorry).

And indeed, this page I just created containing only the use of that template:

https://fr.wikipedia.org/wiki/Utilisateur:GLavagetto_(WMF)/test_france

causes the same pathological behaviour on PHP7:

https://performance.wikimedia.org/xhgui/run/view?id=5cd54165bb8544ef501f1ed9

While I think we should try to fix this bug, it's probably a good idea if someone more proficient with wikitext than me thinks of a way to fix that template.

Going deeper, this template does lookups on this and other tables:

https://fr.wikipedia.org/w/index.php?title=Mod%C3%A8le:Donn%C3%A9es_Nbcom2006

which are huge key-value lists.
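For illustration, the performance gap between keeping such a key-value list in wikitext and moving it to a Lua data module (as was eventually done on this task) is essentially linear scan versus hash lookup. A Python sketch (names and sizes are made up; this is not MediaWiki code):

```python
# Illustrative stand-in: a wikitext {{#switch:}} behaves roughly like a
# linear scan over all cases on every transclusion, while a Lua module
# loaded via mw.loadData behaves like a table built once and indexed
# directly.

DATA = {f"commune-{i}": i for i in range(10_000)}  # made-up stand-in data

def switch_style_lookup(key):
    # #switch-like behavior: walk the cases until one matches
    for k, v in DATA.items():
        if k == key:
            return v
    return None  # #default

def module_style_lookup(key):
    # Lua-data-module-like behavior: direct hash lookup
    return DATA.get(key)

# Both return the same answers; only the cost per lookup differs.
assert switch_style_lookup("commune-9999") == 9999
assert module_style_lookup("commune-9999") == 9999
```

With thousands of keys and many lookups per page, the scan-per-lookup model multiplies into the render times seen above.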

Reedy closed this task as Resolved. (Edited May 10 2019, 3:38 PM)
Reedy claimed this task.
Reedy subscribed.

Claiming as resolved. I migrated all of the problem templates on frwiki to Lua modules, which are much more performant.

For example:
https://fr.wikipedia.org/wiki/Module:Donn%C3%A9es_Rang2013 was created to hold the data that was in wikitext in https://fr.wikipedia.org/wiki/Mod%C3%A8le:Donn%C3%A9es_Rang2013, and the latter was switched to use the Module.

This was done for all https://fr.wikipedia.org/wiki/Sp%C3%A9cial:Index/Mod%C3%A8le:Donn%C3%A9es_Rang and https://fr.wikipedia.org/wiki/Sp%C3%A9cial:Index/Mod%C3%A8le:Donn%C3%A9es_Nbcom

Follow-up tasks should be filed to work out why there was such a memory difference in the first place, and to see whether this can be mitigated or improved in the future.

More interesting: I raised the memory limit on mwdebug1002 and got profiling information from both hhvm and php7 and the main difference is the memory spent in Preprocessor_Hash::preprocessToObj:

  • 1 GB in php7
  • 220 MB in HHVM

Yeah, it's interesting that the various ppvisitednodes etc counts confirm that HHVM and PHP seem to be doing exactly the same work; ie, there's no weird PHP-7 behavior which is causing it to generate a subtly different graph. Assuming that the generated objects are exactly the same (which again, the counts seem to confirm), slight representation differences for object *probably* wouldn't account for such a large difference. So here are my two theories:

  1. (most likely, I think) PHP7 has a different GC/cycle-counting algorithm than HHVM, which is causing it to hold on to temporary objects for much longer. I think all that would be required would be for it to release a large temporary *after* a recursive call instead of *before*, and that could snowball with a large data structure. This is most likely and sadly probably hardest to fix, although we could probably work around it by explicitly setting some variable to null at an appropriate time.
  2. (less likely) PHP7 uses a different memory layout for some small but oft-repeated object, which is snowballing. @Joe mentioned "huge K-V lists" -- if a single KV item takes nominally (making up numbers) 7 bytes and HHVM rounds that to an 8-byte allocation while PHP7 rounds it to a 1024-byte allocation, this probably wouldn't have much effect at all.... until you have 10,000 of them on a single page. It's possible in that case that simply changing the way some object is represented could "fix" the problem.
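Theory 1 above can be illustrated with a toy model: if a large temporary is only released after a recursive call returns, peak live memory grows with recursion depth, while releasing it before recursing keeps the peak flat. A Python sketch (purely illustrative; the allocator is simulated with counters, and the sizes are made up):

```python
# Toy model of theory 1: track "live" bytes by hand to compare two
# release orders around a recursive call.
peak = 0
live = 0

def alloc(n):
    global live, peak
    live += n
    peak = max(peak, live)
    return n

def free(n):
    global live
    live -= n

def expand_hold(depth):
    # large temporary released only *after* the recursive call returns
    if depth == 0:
        return
    tmp = alloc(1000)
    expand_hold(depth - 1)
    free(tmp)

def expand_release(depth):
    # large temporary released *before* recursing
    if depth == 0:
        return
    tmp = alloc(1000)
    free(tmp)
    expand_release(depth - 1)

peak = live = 0
expand_hold(20)
hold_peak = peak          # grows with depth: 20 * 1000

peak = live = 0
expand_release(20)
release_peak = peak       # stays flat: 1000

assert hold_peak == 20_000 and release_peak == 1_000
```

With the deep template expansions on these pages, that kind of per-frame difference could plausibly snowball into the ~1 GB vs ~220 MB gap measured above.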

There is a difference in the counts, though.

PHP 7 reports "expansiondepth":{"value":23,"limit":40} while HHVM reports "expansiondepth":{"value":22,"limit":40}, i.e. one level shallower. The depth of the stack can cause a big difference in memory usage, but the other counts do not indicate why, or where, it goes deeper.

When making the timingprofile of php7 readable:

"100.00% 24426.061      1 -total",
" 91.03% 22234.437      1 Modèle:Tableau_rang_commune_de_France",
" 14.17% 3461.455      8 Modèle:D-Nbcom",
" 12.84% 3136.004      1 Modèle:Données_Rang2013",
" 12.58% 3073.871      1 Modèle:Données_Rang2009",
" 11.53% 2817.202      1 Modèle:Données_Rang2006",
"  9.49% 2318.452      1 Modèle:Données_Rang1999",
"  8.47% 2067.674      1 Modèle:Données_Rang1990",
"  7.89% 1927.185      1 Modèle:Données_Rang1975",
"  7.48% 1827.322      1 Modèle:Données_Rang1982"

the timingprofile of hhvm

"100.00% 4844.654      1 -total",
" 66.05% 3200.061      1 Modèle:Tableau_rang_commune_de_France",
" 11.16%  540.643      1 Modèle:Infobox_Commune_de_France",
" 11.02%  533.866      1 Modèle:Infobox_Subdivision_administrative",
"  9.67%  468.373      1 Modèle:Données_Rang1968",
"  9.31%  451.182      1 Modèle:Données_Rang1975",
"  8.66%  419.316      1 Modèle:Données_Rang1982",
"  7.60%  368.230      1 Modèle:Données_Rang1990",
"  7.17%  347.524      1 Modèle:Population_de_France/section",
"  6.91%  334.755      1 Modèle:Données_Rang2013"

It shows that each parser/preprocessor spends its time in different templates. Unfortunately the list is not complete, so it is not possible to see whether they use the same templates throughout the whole parse process.

The count of 8 for Modèle:D-Nbcom (a very small template) reflects the calls to scopedProfiling in Parser::braceSubstitution when a template is loaded; that matches how often the template is used in Modèle:Tableau_rang_commune_de_France.
HHVM also needs to process this template 8 times, but it does so much faster than PHP 7.
Parser::getTemplateDom has a cache, so there is no obvious reason why one runtime should be notably faster here.
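The caching pattern referred to here (the first transclusion pays the preprocessing cost, the other 7 hit the cache) can be sketched as follows. The names are illustrative Python stand-ins, not the actual MediaWiki API:

```python
# Sketch of a per-parse template DOM cache: repeated transclusions of the
# same template only preprocess it once.
preprocess_calls = 0
_dom_cache = {}

def expensive_preprocess(title):
    # stand-in for parsing the template wikitext into a DOM
    global preprocess_calls
    preprocess_calls += 1
    return f"<dom:{title}>"

def get_template_dom(title):
    if title not in _dom_cache:
        _dom_cache[title] = expensive_preprocess(title)
    return _dom_cache[title]

for _ in range(8):
    get_template_dom("Modèle:D-Nbcom")
print(preprocess_calls)  # 1 -- only the first use pays the preprocessing cost
```

So the 8x profiling entries measure expansion of the cached DOM, not repeated preprocessing, which is why the timing gap points at expansion speed rather than parsing.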

expansionDepth is a static variable in PPFrame_Hash::expand() -- as such it could be incremented by 1 by some PHP7-specific parse earlier in the process lifetime. For example, in the header which says "this is an experimental service running PHP7" or some other bit of wikitext which appears only on the PHP7 machines. I think that's a red herring.

@Reedy, I've just spotted a conversation on a wiki: that change is breaking some things (displaying "Erreur Lua : not enough memory", i.e. "Lua error: not enough memory", at the bottom of some articles), and it is apparently due to templates you've created. I haven't investigated more deeply.


It's entirely possible. They have a *lot* of data in some of them, such as https://fr.wikipedia.org/wiki/Module:Donn%C3%A9es_Rang1990, which is 900KB+ of raw text and has to be loaded into memory. It was definitely a lot quicker (on PHP7 the originally reported page didn't even load at all, which is more of an issue, as we couldn't move forward).

The fact they worked in the parserfunction versions is somewhat of a miracle...

But I guess it's not exactly helped by apparently every single one of them being called/used on one page...

I guess part of the problem is that we've traded CPU for memory... A MW request can use 660MB, but Scribunto is limited to 50MB. It could be worth increasing the limit to see how much it helps. Will file a task.

Yeah, it's interesting that the various ppvisitednodes etc counts confirm that HHVM and PHP seem to be doing exactly the same work; ie, there's no weird PHP-7 behavior which is causing it to generate a subtly different graph. Assuming that the generated objects are exactly the same (which again, the counts seem to confirm), slight representation differences for object *probably* wouldn't account for such a large difference. So here are my two theories:

  1. (most likely, I think) PHP7 has a different GC/cycle-counting algorithm than HHVM, which is causing it to hold on to temporary objects for much longer. I think all that would be required would be for it to release a large temporary *after* a recursive call instead of *before*, and that could snowball with a large data structure. This is most likely and sadly probably hardest to fix, although we could probably work around it by explicitly setting some variable to null at an appropriate time.
  2. (less likely) PHP7 uses a different memory layout for some small but oft-repeated object, which is snowballing. @Joe mentioned "huge K-V lists" -- if a single KV item takes nominally (making up numbers) 7 bytes and HHVM rounds that to an 8-byte allocation while PHP7 rounds it to a 1024-byte allocation, this probably wouldn't have much effect at all.... until you have 10,000 of them on a single page. It's possible in that case that simply changing the way some object is represented could "fix" the problem.

Did we file a task to look into the differences between hhvm and php?

Some volunteers are looking into transferring the enormous data files from French Wikipedia to Wikidata.

mmodell changed the subtype of this task from "Task" to "Production Error". (Aug 28 2019, 11:07 PM)