Avoid timeouts on large articles
Closed, Resolved · Public

Description

Huge, template-heavy articles like http://en.wikipedia.org/wiki/List_of_Advanced_Dungeons_%26_Dragons_2nd_edition_monsters currently take longer to render than the Varnish backend timeout of 60 seconds. VE currently has a 100-second timeout of its own, but even if that fires we should still let the backend finish rendering so that the next request can use the cached copy.

See also: VE bug 50475.


Version: unspecified
Severity: normal

bzimport set Reference to bz51053.
GWicke created this task. Jul 9 2013, 5:40 PM

Change 72681 had a related patch set uploaded by GWicke:
Increase Parsoid backend timeout to 5 minutes

https://gerrit.wikimedia.org/r/72681
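The patch itself is not quoted in this task. As a rough sketch only, a backend timeout increase of this kind would typically be expressed in VCL along the following lines (the backend name, host, and port here are placeholders, not values from the actual change):

```vcl
# Hypothetical Parsoid backend definition; host/port are placeholders.
backend parsoid {
    .host = "parsoid.example.internal";
    .port = "8000";
    # Allow long-running renders: wait up to 5 minutes for the first
    # byte of the response instead of Varnish's 60-second default.
    .first_byte_timeout = 300s;
    .between_bytes_timeout = 60s;
}
```

`first_byte_timeout` governs how long Varnish waits for the backend to start responding, which is the limit a slow parse hits first; `between_bytes_timeout` can stay at its default once data is flowing.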

The increased timeout is now deployed. Editing http://en.wikipedia.org/wiki/List_of_Advanced_Dungeons_%26_Dragons_2nd_edition_monsters works on the second try: the first attempt times out on the client side (probably the 100-second timeout), but on retry the by-then cached copy is returned.

This needs additional testing on really large pages that take close to 5 minutes to render. It is not clear that Varnish will continue the backend request if the client connection is dropped before the page is done rendering.

I will continue to monitor the Varnish logs for timeouts, but so far things are looking good. Resolving as fixed.
