
"HTTP request timed out" for large files
Closed, Duplicate · Public

Description

During an attempted GWToolset upload last year I ran into a problem with some larger images (around 730 MB): I get an HTTP timeout reply already at the second stage of the upload process.

@dan-nl has also spent a while debugging this and concluded that something seems to be wrong with the HTTP request that uses cURL to discover the headers of the item: for some reason it times out even though it is only requesting the headers, not the file itself.

For testing, the relevant XML is

, the mapping is at GWToolset:Metadata Mappings/Lokal Profil/KB-maps.json, and an example image for further testing can be found at https://data.kb.se/datasets/2014/06/kartor/2882568_53_Skargardskriget_hagelsta.tif
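
One way to poke at the failing header request outside of GWToolset is a plain HEAD request against the example image with the same timeout budget. A minimal sketch in Python with the requests library (an approximation only: GWToolset itself goes through MediaWiki's cURL-based HTTP layer, and the 90 s figure is the one quoted later in this task):

```python
import requests

URL = ("https://data.kb.se/datasets/2014/06/kartor/"
       "2882568_53_Skargardskriget_hagelsta.tif")

try:
    # timeout=(connect, read) in seconds; 90 s mirrors the overall
    # GWToolset request budget mentioned below.
    r = requests.head(URL, allow_redirects=True, timeout=(10, 90))
    print(r.status_code, r.headers.get("Content-Length"))
except requests.exceptions.Timeout:
    print("HEAD request timed out, matching the GWToolset symptom")
```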

Event Timeline

Lokal_Profil raised the priority of this task from to Needs Triage.
Lokal_Profil updated the task description.
Lokal_Profil added subscribers: Lokal_Profil, dan-nl.

I have started uploading these files manually (well, using Python). The underlying problem in GWToolset still remains, though.
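
For reference, the manual route can look roughly like this with Pywikibot (a sketch only: parameter names as in recent Pywikibot releases, and the target file name and chunk size are made up for illustration):

```python
import pywikibot

site = pywikibot.Site("commons", "commons")
page = pywikibot.FilePage(site, "File:Example map.tif")  # hypothetical target

# chunk_size splits the transfer into 1 MiB API requests, so no single
# HTTP request has to move the whole ~730 MB file before a timeout hits.
site.upload(
    page,
    source_filename="2882568_53_Skargardskriget_hagelsta.tif",
    comment="Manual upload while GWToolset times out",
    chunk_size=1024 * 1024,
)
```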

It looks like the current timeout is 90 s; it should be more like 1200 s. Because of this bug we have GLAM partners stuck in their upload processes. We should raise the priority of this bug.

Kelson raised the priority of this task from Low to Unbreak Now!. Dec 23 2015, 5:06 PM
Aklapper lowered the priority of this task from Unbreak Now! to High. Dec 24 2015, 1:53 PM

Because of that bug we have GLAM partners stuck in their upload processes.

More information welcome.

I don't know what more I can say; maybe reading the following comment will help:
https://phabricator.wikimedia.org/T119053#1901103

... anyway, I'm pretty sure the overall HTTP request timeout for GWT is 90 s, and this is definitely too short to download, for example, 200 MB over a 10 Mb/s connection.
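
As a back-of-the-envelope check (using the 90 s and 1200 s figures from the comments above and the 10 Mb/s bandwidth from this example):

```python
# Idealised transfer time, ignoring protocol overhead and latency.
def transfer_seconds(size_mb: float, bandwidth_mbit: float) -> float:
    return size_mb * 8 / bandwidth_mbit  # MB -> Mbit, then Mbit / (Mbit/s)

print(transfer_seconds(200, 10))  # 160.0 s -> already past the 90 s timeout
print(transfer_seconds(730, 10))  # 584.0 s -> fits within the proposed 1200 s
```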

Shouldn't we close this task now that T119053 is fixed? It sounds to me like a duplicate of T119053.

Mhutti1 claimed this task.

There are many timeouts, and one will still hit them given a big enough file, but the situation is much improved, so I do think it makes sense to close this.