Editing very high-use templates results in a timeout in BacklinkCache::partition()
Closed, Resolved · Public

Description

When someone edits a very high-use template (a template with millions of transclusions), the edit doesn't complete cleanly: instead of the template page reloading on save, the user is presented with a read timeout error message. This read timeout behavior has existed for a few years now.


Version: unspecified
Severity: normal

bzimport set Reference to bz37731.
MZMcBride created this task. Via Legacy · Jun 20 2012, 2:31 AM
MZMcBride added a comment. Via Conduit · Jun 20 2012, 2:35 AM

According to Tim Starling in #wikimedia-tech just now:

[20-Jun-2012 02:28:44] Fatal error: Maximum execution time of 180 seconds exceeded at /usr/local/apache/common-local/php-1.20wmf5/includes/db/DatabaseMysql.php on line 285
that's me
and it fails in BacklinkCache->partition, very nice

This was the result of this edit: https://commons.wikimedia.org/w/index.php?diff=prev&oldid=72965783. The template has 2,783,343 transclusions according to the Toolserver's commonswiki_p right now.

tstarling added a comment. Via Conduit · Jun 20 2012, 3:13 AM

Updated summary to reflect cause.

tstarling added a comment. Via Conduit · Jul 3 2012, 3:56 AM

Workaround method to queue jobs from the command line:
https://commons.wikimedia.org/wiki/User:Tim_Starling/fixing_link_tables
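
For illustration, a rough sketch of what such a command-line script might do, assuming MediaWiki 1.20-era classes (Title::getBacklinkCache(), BacklinkCache::partition(), Job::batchInsert()); the refreshLinks2 job parameter names are assumptions here, not copied from the linked page:

<?php
// Run from the command line (e.g. via maintenance/eval.php), so the
// partitioning and job insertion happen outside the web request.
$title = Title::makeTitleSafe( NS_TEMPLATE, 'Date' );

// Partition the template's backlinks (templatelinks rows) into ranges
// of roughly $batchSize pages each.
$batchSize = 500;
$batches = $title->getBacklinkCache()->partition( 'templatelinks', $batchSize );

$jobs = array();
foreach ( $batches as $batch ) {
	list( $start, $end ) = $batch;
	// One refreshLinks2 job per range; the parameter names are illustrative.
	$jobs[] = new RefreshLinksJob2( $title, array(
		'table' => 'templatelinks',
		'start' => $start,
		'end'   => $end,
	) );
}

// Queue everything for the job runners instead of reparsing millions of
// pages synchronously on template save.
Job::batchInsert( $jobs );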

duplicatebug added a comment. Via Conduit · Nov 28 2012, 1:26 PM

(In reply to comment #4)

https://gerrit.wikimedia.org/r/#/c/32488/

Status: Merged

bzimport added a comment. Via Conduit · Dec 27 2012, 12:11 AM

afeldman wrote:

https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to BacklinkCache::getNumLinks but some related jobs are still failing in BacklinkCache::getLinks like this:

Wed Dec 26 23:40:42 UTC 2012 mw14 commonswiki BacklinkCache::getLinks 10.0.6.61 2008 MySQL client ran out of memory (10.0.6.61) SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM templatelinks,page WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from

That query returns 12384915 rows and would have to be batched.
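
For comparison, a hedged sketch of how such a query could be walked in batches using MediaWiki's database abstraction, paging on tl_from rather than buffering every row at once; the batch size and the per-row processing step are placeholders, not the actual fix:

<?php
// Hedged sketch: read the templatelinks backlinks for Template:Date in
// chunks keyed on tl_from, so the MySQL client never holds millions of rows.
$dbr = wfGetDB( DB_SLAVE );
$batchSize = 10000;   // placeholder batch size
$lastFrom = 0;

do {
	$res = $dbr->select(
		array( 'templatelinks', 'page' ),
		array( 'page_namespace', 'page_title', 'page_id' ),
		array(
			'tl_namespace' => NS_TEMPLATE,
			'tl_title' => 'Date',
			'page_id = tl_from',
			'tl_from > ' . intval( $lastFrom ),
		),
		__METHOD__,
		array( 'ORDER BY' => 'tl_from', 'LIMIT' => $batchSize, 'STRAIGHT_JOIN' )
	);
	foreach ( $res as $row ) {
		$lastFrom = $row->page_id;   // page_id equals tl_from in this join
		// ... process one backlinked page here ...
	}
} while ( $res->numRows() == $batchSize );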

aaron added a comment. Via Conduit · Dec 27 2012, 1:10 AM

(In reply to comment #6)

https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to BacklinkCache::getNumLinks but some related jobs are still failing in BacklinkCache::getLinks like this:

Wed Dec 26 23:40:42 UTC 2012 mw14 commonswiki BacklinkCache::getLinks 10.0.6.61 2008 MySQL client ran out of memory (10.0.6.61) SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM templatelinks,page WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from

That query returns 12384915 rows and would have to be batched.

Moved to bug 43452 since this is about users getting timeouts.
