
Fatal error "request has exceeded memory limit" from Echo DiscussionParser
Open · Needs Triage · Public

Description

Error

Request ID: W5scCArAACwAAJFlWbwAAABT
Context: en.wikipedia.org, http_method=POST, url=/w/api.php?format=json&action=edit

channel: fatal
reqId: W5scCArAACwAAJFlWbwAAABT

PHP Fatal Error: request has exceeded memory limit

#0 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(682): NO_FUNCTION_GIVEN()
#1 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(598): EchoDiscussionParser::getFullSection()
#2 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(467): EchoDiscussionParser::interpretDiff(array, string, Title)
#3 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(37): EchoDiscussionParser::getChangeInterpretationForRevision(Revision)
#4 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/EchoHooks.php(525): EchoDiscussionParser::generateEventsForRevision(Revision, boolean)
#5 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/MWCallableUpdate.php(34): Closure$EchoHooks::onPageContentSaveComplete()
#6 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(268): MWCallableUpdate->doUpdate()
#7 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(214): DeferredUpdates::runUpdate(MWCallableUpdate, Wikimedia\Rdbms\LBFactoryMulti, string, integer)
#8 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(134): DeferredUpdates::execute(array, string, integer)
#9 /srv/mediawiki/php-1.32.0-wmf.20/includes/MediaWiki.php(914): DeferredUpdates::doUpdates(string)
#10 /srv/mediawiki/php-1.32.0-wmf.20/includes/MediaWiki.php(734): MediaWiki->restInPeace(string, boolean)
#11 [internal function]: Closure$MediaWiki::doPostOutputShutdown()

Notes

This OOM error is seen about 50 times per day when searching Logstash for "exceeded memory limit" AND Echo. While it could be a coincidence, the fact that all of them run out of memory at the exact same stack frame in EchoDiscussionParser suggests otherwise.

Original task description
  1. Error:
     PHP Warning: Destructor threw an object exception: exception 'Wikimedia\Rdbms\DBTransactionError' with message 'Wikimedia\Rdbms\LBFactory::shutdown: transaction round 'MWCallableUpdate::doUpdate' still running.' in /srv/mediawiki/php-1.32.0-wmf.20/includes/libs/rdbms/lbfactory/LBFactory.php:257
     Stack trace:
     #0 /srv/mediawiki/php-1.32.0-wmf.20/includes/libs/rdbms/lbfactory/LBFactory.php(179): Wikimedia\Rdbms\LBFactory->commitMasterChanges()
     #1 /srv/mediawiki/php-1.32.0-wmf.20/includes/libs/rdbms/lbfactory/LBFactory.php(152): Wikimedia\Rdbms\LBFactory->shutdown()
     #2 /srv/mediawiki/php-1.32.0-wmf.20/includes/libs/rdbms/lbfactory/LBFactory.php(728): Wikimedia\Rdbms\LBFactory->destroy()
     #3 (): Wikimedia\Rdbms\LBFactory->__destruct()
     #4 {main}
  2. Notes: Seen with different wikis and web servers, including www.wikidata.org, in POST requests with url /w/api.php or url=/w/api.php?format=json&action=edit.

Event Timeline

Krinkle created this task. · Sep 14 2018, 3:38 AM
Restricted Application added a subscriber: Aklapper. · Sep 14 2018, 3:38 AM
Krinkle added a comment (edited). · Sep 14 2018, 3:41 AM

The stack trace from the destructor doesn't explain what happened earlier in the request, and given its infrequency (a few times a day), it will be hard to reproduce or research without more information. However, using the request ID I was able to correlate it to a fatal error.

channel: fatal
reqId: W5scCArAACwAAJFlWbwAAABT

PHP Fatal Error: request has exceeded memory limit

#0 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(682): NO_FUNCTION_GIVEN()
#1 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(598): EchoDiscussionParser::getFullSection()
#2 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(467): EchoDiscussionParser::interpretDiff(array, string, Title)
#3 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/DiscussionParser.php(37): EchoDiscussionParser::getChangeInterpretationForRevision(Revision)
#4 /srv/mediawiki/php-1.32.0-wmf.20/extensions/Echo/includes/EchoHooks.php(525): EchoDiscussionParser::generateEventsForRevision(Revision, boolean)
#5 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/MWCallableUpdate.php(34): Closure$EchoHooks::onPageContentSaveComplete()
#6 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(268): MWCallableUpdate->doUpdate()
#7 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(214): DeferredUpdates::runUpdate(MWCallableUpdate, Wikimedia\Rdbms\LBFactoryMulti, string, integer)
#8 /srv/mediawiki/php-1.32.0-wmf.20/includes/deferred/DeferredUpdates.php(134): DeferredUpdates::execute(array, string, integer)
#9 /srv/mediawiki/php-1.32.0-wmf.20/includes/MediaWiki.php(914): DeferredUpdates::doUpdates(string)
#10 /srv/mediawiki/php-1.32.0-wmf.20/includes/MediaWiki.php(734): MediaWiki->restInPeace(string, boolean)
#11 [internal function]: Closure$MediaWiki::doPostOutputShutdown()

This OOM error is seen about 50 times per day when searching Logstash for "exceeded memory limit" AND Echo. While it could be a coincidence, the fact that all of them run out of memory at the exact same stack frame in EchoDiscussionParser suggests otherwise.

I'll rename the task given that the destructor warning is merely a side-effect.

Krinkle renamed this task from LBFactory destructor throws DBTransactionError: "transaction round 'MWCallableUpdate::doUpdate' still running" to Fatal error "request has exceeded memory limit" from Echo DiscussionParser. · Sep 14 2018, 3:47 AM
Krinkle updated the task description.
Krinkle edited projects, added Notifications; removed MediaWiki-Page-editing, MediaWiki-API.
Restricted Application added a project: Growth-Team. · Sep 14 2018, 3:47 AM
Krinkle updated the task description. · Sep 14 2018, 3:48 AM

Seems like this code is just problematic in general, given T203930: EchoDiscussionParser is slow, causes timeouts.

Change 462754 had a related patch set uploaded (by Thiemo Kreuz (WMDE); owner: Thiemo Kreuz (WMDE)):
[mediawiki/extensions/Echo@master] Use \h instead of \s in regular expressions

https://gerrit.wikimedia.org/r/462754
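The motivation for that patch: in PCRE, \s matches vertical whitespace (including newlines) as well, so a pattern intended to match within a single line can run across line breaks, which both changes what matches and can make backtracking much more expensive on large pages; \h matches horizontal whitespace only. A quick illustration (in Python, whose re module lacks \h, so the character class [ \t] stands in for it; the pattern and input here are hypothetical, not taken from the patch):

```python
import re

# \s also matches the newline, so this "single line" pattern crosses lines:
assert re.search(r"A\s+B", "A\nB") is not None

# Horizontal whitespace only (PCRE's \h; here [ \t]) stays on one line:
assert re.search(r"A[ \t]+B", "A\nB") is None
```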

Change 462759 had a related patch set uploaded (by Thiemo Kreuz (WMDE); owner: Thiemo Kreuz (WMDE)):
[mediawiki/extensions/Echo@master] Remove expensive regular expression that doesn't have any effect

https://gerrit.wikimedia.org/r/462759

SBisson moved this task from Inbox to External on the Growth-Team board. · Sep 28 2018, 2:36 AM

Change 462759 merged by jenkins-bot:
[mediawiki/extensions/Echo@master] Remove expensive regular expression that doesn't have any effect

https://gerrit.wikimedia.org/r/462759

Change 465762 had a related patch set uploaded (by Krinkle; owner: Thiemo Kreuz (WMDE)):
[mediawiki/extensions/Echo@wmf/1.32.0-wmf.24] Remove expensive regular expression that doesn't have any effect

https://gerrit.wikimedia.org/r/465762

Change 465762 merged by jenkins-bot:
[mediawiki/extensions/Echo@wmf/1.32.0-wmf.24] Remove expensive regular expression that doesn't have any effect

https://gerrit.wikimedia.org/r/465762

Mentioned in SAL (#wikimedia-operations) [2018-10-10T21:48:50Z] <krinkle@deploy1001> Synchronized php-1.32.0-wmf.24/extensions/Echo/includes/DiscussionParser.php: T204291 - Ia5323b401b94 (duration: 00m 51s)

Change 462754 merged by jenkins-bot:
[mediawiki/extensions/Echo@master] Use \h instead of \s in regular expressions

https://gerrit.wikimedia.org/r/462754

Krinkle added a subscriber: tstarling.

The diff in question has a large number of diff ops, but only one section. So for every diff op, getSectionStartIndex() and getSectionEndIndex() have to run a regex match on almost every line in the page, looking for headings. I think a better algorithm would be to have a cache giving the section start index for each line. If a line is not in the cache, then do the preg_match() on the current line, and if it is not a section header, repeat for the previous line with a new cache fetch. And do the same for getSectionEndIndex() looking forwards.
In other words, imagine getSectionStartIndex($i) as the recursive function

function getSectionStartIndex( $i, array $lines ) {
   if ( $i === 0 ) {
      // The first line always belongs to the section starting at index 0.
      return 0;
   } elseif ( getSectionCount( $lines[$i] ) ) {
      // This line is itself a section heading.
      return $i;
   } else {
      // Otherwise the section starts wherever the previous line's section starts.
      return getSectionStartIndex( $i - 1, $lines );
   }
}

Then memoize that function and implement it iteratively rather than recursively.
I appreciate that it's difficult to do caching of any kind in a parser implemented as a collection of static functions. It may be easier to convert it to non-static before you do this.
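The memoized, iterative version amounts to a single forward pass that records, for each line, where its section starts, so the heading check runs once per line instead of once per diff op. A sketch under those assumptions (in Python for brevity; `heading_re` is a hypothetical stand-in for whatever pattern getSectionCount() applies):

```python
import re

def section_start_indexes(lines, heading_re):
    # One forward pass: remember the most recent heading line seen so far.
    # starts[i] is the index where line i's section begins.
    start = 0
    starts = []
    for i, line in enumerate(lines):
        if i > 0 and heading_re.match(line):
            start = i
        starts.append(start)
    return starts

# Hypothetical heading pattern, e.g. "== Section title ==":
heading_re = re.compile(r"^=+[^=]+=+\s*$")
lines = ["intro", "== A ==", "some text", "== B ==", "more text"]
# getSectionStartIndex(i) then reduces to a lookup: starts[i].
starts = section_start_indexes(lines, heading_re)
```

A mirror-image backward pass over the lines gives the getSectionEndIndex() table in the same way.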

mmodell changed the subtype of this task from "Task" to "Production Error".Aug 28 2019, 11:08 PM