Avoid timeouts on large articles
Closed, Resolved · Public

Description

Huge, template-heavy articles like http://en.wikipedia.org/wiki/List_of_Advanced_Dungeons_%26_Dragons_2nd_edition_monsters currently take longer to render than Varnish's 60-second backend timeout. VE has a 100-second client timeout; even when that expires, the backend should be allowed to finish rendering so that the next request can be served from the cached copy.
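As a rough sketch of the intended client-side behavior (the /parsoid/render endpoint and the retry wrapper are illustrative assumptions, not part of this task; the 100-second value is VE's client timeout from above):

```typescript
// Sketch only: the endpoint path is a hypothetical placeholder.
async function fetchWithTimeout(url: string, timeoutMs: number): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

async function renderArticle(title: string): Promise<string> {
  const url = `/parsoid/render?title=${encodeURIComponent(title)}`;
  try {
    // First attempt: a very large page may exceed the 100 s client timeout.
    return await (await fetchWithTimeout(url, 100_000)).text();
  } catch {
    // If the backend kept rendering after the client gave up, the page
    // should now be cached, and a retry should return quickly.
    return await (await fetchWithTimeout(url, 100_000)).text();
  }
}
```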

See also: VE bug 50475.


Version: unspecified
Severity: normal

bzimport added a project: Parsoid-Web-API. Via Conduit · Nov 22 2014, 2:05 AM
bzimport set Reference to bz51053.
GWicke created this task. Via Legacy · Jul 9 2013, 5:40 PM
gerritbot added a comment. Via Conduit · Jul 9 2013, 5:41 PM

Change 72681 had a related patch set uploaded by GWicke:
Increase Parsoid backend timeout to 5 minutes

https://gerrit.wikimedia.org/r/72681

GWicke added a comment. Via Conduit · Jul 10 2013, 6:01 PM

The increased timeout is now deployed. Editing http://en.wikipedia.org/wiki/List_of_Advanced_Dungeons_%26_Dragons_2nd_edition_monsters works on the second try. The client connection times out the first time (probably the 100-second timeout), but on retry the by-then cached copy is returned.

This needs additional testing on really large pages that take close to 5 minutes to render. It is not clear whether Varnish will continue the backend request if the client connection is dropped before the page has finished rendering.
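One way to probe that open question by hand would be something along these lines (a sketch only; the URL is a placeholder and the 300-second wait mirrors the new backend timeout):

```typescript
// Sketch of a manual check: does a dropped client connection still
// result in a cached render? URL and timings are assumptions.
async function checkRenderSurvivesDisconnect(url: string): Promise<void> {
  // 1. Start a request and abort it early, simulating a client timeout.
  const controller = new AbortController();
  const firstAttempt = fetch(url, { signal: controller.signal }).catch(() => null);
  setTimeout(() => controller.abort(), 5_000); // drop the connection after 5 s
  await firstAttempt;

  // 2. Give the backend up to the full 5-minute timeout to finish rendering.
  await new Promise((resolve) => setTimeout(resolve, 300_000));

  // 3. Re-request: a fast response suggests the render completed and was
  //    cached despite the dropped connection.
  const start = Date.now();
  const response = await fetch(url);
  await response.text();
  console.log(`second request: HTTP ${response.status} in ${Date.now() - start} ms`);
}
```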

GWicke added a comment. Via Conduit · Jul 12 2013, 3:29 AM

I will continue to monitor the Varnish logs for timeouts, but so far things are looking good. Resolving as fixed.
