
504 Gateway Time-out on https://de.wikipedia.org/w/index.php?title=Wikipedia:L%C3%B6schkandidaten&action=info
Closed, Resolved · Public

Description

A user pointed out to me that the page properties at https://de.wikipedia.org/w/index.php?title=Wikipedia:L%C3%B6schkandidaten&action=info cannot be read. The strangest thing is the attached error message: it showed *his* IP address instead of mine.

I tried opening pages with a similar number of revisions (12k) and subpages, and they loaded fine.

Details

Event Timeline

Vituzzu created this task. Jan 28 2017, 12:58 AM
Restricted Application added a subscriber: Aklapper. Jan 28 2017, 12:58 AM
Vituzzu updated the task description. Jan 28 2017, 12:59 AM
zhuyifei1999 added a subscriber: zhuyifei1999.

Added Operations since their assistance may be needed to debug page hanging.

I get a 504 Gateway Time-out instead of the 503 in the attachment of this task:

$:andre\> curl -v https://de.wikipedia.org/w/index.php?title=Wikipedia:L%C3%B6schkandidaten\&action=info
[...]
* Connection state changed (MAX_CONCURRENT_STREAMS updated)!
< HTTP/2.0 504
< server:nginx/1.11.6
< date:Sat, 28 Jan 2017 18:28:10 GMT
< content-type:text/html
< content-length:183
< 
<html>
<head><title>504 Gateway Time-out</title></head>
<body bgcolor="white">
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.11.6</center>
</body>
</html>
* Connection #0 to host de.wikipedia.org left intact
Aklapper renamed this task from "Unable to load page properties of a certain page" to "504 Gateway Time-out on https://de.wikipedia.org/w/index.php?title=Wikipedia:L%C3%B6schkandidaten&action=info". Jan 28 2017, 6:40 PM

The query is most likely:

SELECT /* WikiPage::getOldestRevision */ rev_id,rev_page,rev_text_id,rev_timestamp,rev_comment,rev_user_text,rev_user,rev_minor_edit,rev_deleted,rev_len,rev_parent_id,rev_sha1,rev_content_format,rev_content_model FROM `page`,`revision` WHERE page_namespace = '4' AND page_title = 'Löschkandidaten' AND (rev_page = page_id) ORDER BY rev_timestamp ASC LIMIT 1

This makes it faster:

SELECT /* WikiPage::getOldestRevision */ rev_id,rev_page,rev_text_id,rev_timestamp,rev_comment,rev_user_text,rev_user,rev_minor_edit,rev_deleted,rev_len,rev_parent_id,rev_sha1,rev_content_format,rev_content_model FROM `page` JOIN `revision` USE INDEX (page_timestamp) ON rev_page = page_id WHERE page_namespace = '4' AND page_title = 'Löschkandidaten' ORDER BY rev_timestamp ASC LIMIT 1;
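The effect of the index hint can be sketched with a small, self-contained toy example. The schema below is a reduced stand-in for MediaWiki's page/revision tables, and SQLite stands in for MySQL (so the optimizer behaviour differs, and SQLite's INDEXED BY plays the role of MySQL's USE INDEX); it is an illustration of the technique, not the production query.

```python
import sqlite3

# Toy schema loosely mirroring MediaWiki's page/revision tables
# (column set reduced; SQLite stands in for MySQL, so INDEXED BY
# replaces USE INDEX and the planner behaves differently).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page (page_id INTEGER PRIMARY KEY,
                       page_namespace INTEGER, page_title TEXT);
    CREATE TABLE revision (rev_id INTEGER PRIMARY KEY,
                           rev_page INTEGER, rev_timestamp TEXT);
    CREATE INDEX page_timestamp ON revision (rev_page, rev_timestamp);
""")
conn.execute("INSERT INTO page VALUES (1, 4, 'Löschkandidaten')")
conn.executemany("INSERT INTO revision VALUES (?, 1, ?)",
                 [(i, f"201701{i:02d}000000") for i in range(1, 10)])

# The (rev_page, rev_timestamp) index satisfies both the join condition
# and the ORDER BY, so the oldest revision is found with one index seek
# instead of sorting every revision of the page.
row = conn.execute("""
    SELECT rev_id, rev_timestamp
    FROM page JOIN revision INDEXED BY page_timestamp
         ON rev_page = page_id
    WHERE page_namespace = 4 AND page_title = 'Löschkandidaten'
    ORDER BY rev_timestamp ASC LIMIT 1
""").fetchone()
print(row)  # (1, '20170101000000')
```

The same idea underlies the rewritten query above: steering the planner toward the (rev_page, rev_timestamp) index avoids the bad plan the optimizer otherwise picks.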
jcrespo added a subscriber: daniel. Jan 31 2017, 5:05 PM

Adding @daniel (not expecting him to work on it) because I think he mentioned this specific query (getting the first edit) on a recent 1:1 (I may be wrong). In any case, it may be interesting for his work and our ongoing conversations.

Yes, finding the first revision of an article is notoriously expensive, especially on a page like Wikipedia:Löschkandidaten (Articles for Deletion), which has lots of revisions.

For this specific case: Why is this joining against the page table at all? It only selects columns from revision, and we very likely already know the page ID (it's in the Title object). So just ask for the first revision with that page ID, done.
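The suggested simplification drops the join entirely: once the page ID is known, the oldest revision is a single-table lookup. A minimal sketch, again using SQLite and a toy schema (the helper name is illustrative, not MediaWiki's actual code):

```python
import sqlite3

# Toy revision table; (rev_page, rev_timestamp) mirrors the index the
# production query wants to use.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE revision (rev_id INTEGER PRIMARY KEY,
                           rev_page INTEGER, rev_timestamp TEXT);
    CREATE INDEX page_timestamp ON revision (rev_page, rev_timestamp);
""")
conn.executemany("INSERT INTO revision VALUES (?, ?, ?)",
                 [(1, 7, "20040101000000"),
                  (2, 7, "20050101000000"),
                  (3, 8, "20030101000000")])

def get_oldest_revision(conn, page_id):
    # Single-table query: the (rev_page, rev_timestamp) index satisfies
    # both the WHERE filter and the ORDER BY, so this is one index seek
    # with no join against the page table.
    return conn.execute(
        "SELECT rev_id, rev_timestamp FROM revision "
        "WHERE rev_page = ? ORDER BY rev_timestamp ASC LIMIT 1",
        (page_id,)).fetchone()

print(get_oldest_revision(conn, 7))  # (1, '20040101000000')
```

With the page ID already in hand (from the Title object), the planner has only one sensible plan, which removes the index-choice confusion discussed below.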

An even better solution would be to have a contributions table, in which revisions are associated with users, and can have a flag for "page creation". Revision deletion can mess with that, though.

Change 335432 had a related patch set uploaded (by Daniel Kinzler):
Avoid joining against page table when finding a page's oldest revision.

https://gerrit.wikimedia.org/r/335432

finding the first revision of an article is notoriously expensive

I do not 100% agree, but I do not disagree either. Let me explain: my slightly changed query works (without a full rewrite) because the issue is that MySQL gets confused by the many available indexes when choosing a plan. I think this does not happen on newer MySQL versions.

I agree that simpler queries produce simpler plans, and Daniel's solution (not querying what is not needed), if it works, is much better. I think the core of the issue (for further work) is to make sure the simplest queries are always preferred. So it is 50% human factor, 50% technical limitations.

I said "if it works" because I do not know how often it will double the number of queries, or what the net impact will be overall.

daniel added a comment. Feb 1 2017, 8:51 PM

We could even take this further: put the ID of a page's first revision into page_props. This would probably need fallback code for the case where that revision is deleted/suppressed. But it should make this query fast 99.99% of the time.
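The caching idea above can be sketched as a fast path with a fallback. Everything here is illustrative (the property name, the in-memory dicts standing in for page_props and the revision table, the helper name); it only shows the control flow: trust the cached ID unless the revision it points at has been deleted or suppressed, in which case fall back to the expensive lookup and refresh the cache.

```python
# Sketch of the page_props idea. All names are hypothetical.
FIRST_REV_PROP = "first_revision_id"  # illustrative property key

def get_first_revision_id(page_id, page_props, revisions):
    # Fast path: one unique-key lookup into the cached property.
    cached = page_props.get((page_id, FIRST_REV_PROP))
    if cached is not None and cached in revisions \
            and not revisions[cached]["deleted"]:
        return cached
    # Fallback: scan for the oldest visible revision (the expensive
    # query in production), then refresh the cached property.
    visible = [(r["timestamp"], rev_id)
               for rev_id, r in revisions.items()
               if r["page"] == page_id and not r["deleted"]]
    if not visible:
        return None
    first = min(visible)[1]
    page_props[(page_id, FIRST_REV_PROP)] = first
    return first

revisions = {
    10: {"page": 1, "timestamp": "2004", "deleted": True},
    11: {"page": 1, "timestamp": "2005", "deleted": False},
}
# Stale cache entry: revision 10 has since been suppressed.
page_props = {(1, FIRST_REV_PROP): 10}
print(get_first_revision_id(1, page_props, revisions))  # 11
```

The fallback is what keeps revision deletion from breaking the cache: the stale entry is detected, recomputed, and overwritten, so subsequent reads take the fast path again.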

daniel added a comment. Feb 1 2017, 8:52 PM

I said if it works, because I do not know how often it will double the number of queries, or the net impact overall.

I did not check, but I expect that if we are showing page info for a given page, we already know that page's ID. Even if we don't, I suspect that running an additional unique key lookup will not be horribly slow.

Change 335432 merged by jenkins-bot:
Avoid joining against page table when finding a page's oldest revision.

https://gerrit.wikimedia.org/r/335432


This has rolled out to all wikis. However, the issue appears to persist: the URL from the opening post still times out for me.

Gilles moved this task from Inbox to Radar on the Performance-Team board. May 26 2017, 4:32 PM

@daniel Perhaps try capturing an XHProf profile of the request to find where most of the time is being spent, if it isn't the query?

See https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug#Request_profiling.

Krinkle triaged this task as Normal priority. May 26 2017, 4:55 PM
Krinkle closed this task as Resolved. Jan 22 2019, 10:14 PM
Krinkle edited projects, added Performance-Team; removed Performance-Team (Radar).

Works for me.