
Cryptic "413 Request Entity Too Large" error when uploading new version of >100MB file on Commons
Closed, Duplicate · Public

Description

Using 'Upload a new version of this file' on Commons for large files (my largest successful upload was 53MB) fails instantly after pressing upload, with the error '413 Request Entity Too Large nginx/1.9.4'.

This has happened to me with .webm videos over 100MB (but well below the 1000MB limit); no other file type has been attempted.

Event Timeline

John_Cummings raised the priority of this task from to High.
John_Cummings updated the task description. (Show Details)
John_Cummings added a project: Commons.
John_Cummings subscribed.

I'm afraid that's "expected", though the error message could be way more descriptive.
Only uploads up to 100 MB are supported; see https://commons.wikimedia.org/wiki/Help:Server-side_upload

UploadWizard has a setting that allows 1000MB, but UploadWizard cannot be used for "Upload a new version of this file".
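
For context on where these limits come from: the MediaWiki-side cap is configured via $wgMaxUploadSize, while the 413 itself is emitted by the frontend web server before the request ever reaches MediaWiki. Below is a minimal configuration sketch with assumed values, not Wikimedia's actual production settings:

```php
<?php
// Illustrative only: the numbers below are assumptions, not Wikimedia's
// real configuration.

// MediaWiki-side cap, applied to plain single-request uploads such as
// Special:Upload and "Upload a new version of this file".
$wgMaxUploadSize = [
    '*'   => 100 * 1024 * 1024, // default for all upload types (100 MB)
    'url' => 100 * 1024 * 1024, // upload-by-URL
];

// UploadWizard sends large files in small chunks, so no single HTTP request
// comes near the per-request limits enforced by the frontend (nginx
// client_max_body_size) or PHP (post_max_size, upload_max_filesize). That is
// why it can accept files up to 1000MB while a plain upload of the same file
// is rejected with a 413 before MediaWiki ever sees it.
```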

Aklapper renamed this task from "413 Request Entity Too Large error when uploading new version of file on Commons" to "Cryptic "413 Request Entity Too Large" error when uploading new version of >100MB file on Commons". Oct 20 2015, 11:02 AM
Aklapper raised the priority of this task from High to Needs Triage.
Aklapper set Security to None.

Is there a way around this issue? A way to overwrite the file without using the 'Upload a new version of this file' function?

In T115984#1738166, @Mrjohncummings wrote:

Is there a way around this issue?

https://commons.wikimedia.org/wiki/Help:Server-side_upload or reducing the file size, I'd say

Only uploads up to 100 MB are supported

There's a bug on Special:Upload where the wrong limit is displayed
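
To illustrate the mismatch being described: the number Special:Upload advertises should be the smallest limit that actually applies to a plain upload, and the frontend must allow at least that much as well. A rough sketch of that calculation, assuming a 100 MB stand-in for $wgMaxUploadSize (this is not MediaWiki's actual code):

```php
<?php
// Rough sketch, not MediaWiki's implementation: compute the smallest limit
// that applies to a plain (non-chunked) upload.

// Assumed stand-in for $wgMaxUploadSize['*'] (100 MB), not the real value.
$mediawikiCap = 100 * 1024 * 1024;

// Convert PHP shorthand sizes like "100M" or "2G" to bytes.
$toBytes = function (string $v): int {
    $units = ['K' => 1024, 'M' => 1024 ** 2, 'G' => 1024 ** 3];
    return (int)$v * ($units[strtoupper(substr(trim($v), -1))] ?? 1);
};

$effectiveLimit = min(
    $mediawikiCap,
    $toBytes((string)ini_get('upload_max_filesize')),
    $toBytes((string)ini_get('post_max_size'))
);

// Whatever Special:Upload displays should match this value; the frontend
// (nginx client_max_body_size) also has to accept requests of this size,
// or users get the bare 413 instead.
echo "Limit Special:Upload should advertise: $effectiveLimit bytes\n";
```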

Is there a way around this issue? A way to overwrite the file without using the 'Upload a new version of this file' function?

use https://commons.wikimedia.org/wiki/User_talk:Rillke/bigChunkedUpload.js instead

Hi Bawolff

Thanks so much for the solution; it took 1 minute to set up and worked perfectly :)

Thanks again

John

Is there a way around this issue? A way to overwrite the file without using the 'Upload a new version of this file' function?

use https://commons.wikimedia.org/wiki/User_talk:Rillke/bigChunkedUpload.js instead

FAILED: internal_api_error_MWException: [e100eb88] Exception Caught: No specifications provided to ArchivedFile constructor.

Same error in both cases: when stash is enabled / disabled.
(while uploading a 130MB DJVU file for Wikisource)

Lowering quality is not an option in this case.

FAILED: internal_api_error_MWException: [e100eb88] Exception Caught: No specifications provided to ArchivedFile constructor.

Same error in both cases: when stash is enabled / disabled.
(while uploading a 130MB DJVU file for Wikisource)

That's a separate bug, T94562: Chunked/stashed uploads fail for some pdf and djvu files: "No specifications provided to ArchivedFile constructor.".

So I guess what this bug is asking for, is that we have a custom 413 error page, like how we have custom 404 and 503 error pages.

So I guess what this bug is asking for, is that we have a custom 413 error page, like how we have custom 404 and 503 error pages.

Not only that. It is also about synchronizing the MediaWiki messages about the supported upload size (shown on Special:Upload) with the limits that the servers actually support, so that users are not misled.

Note that Special:Upload is not the preferred upload method; users are directed to it mostly when stashed uploads fail. So the above is necessary to avoid directing users from one bug to another.

Let's just say clearly that uploading >100MB files with a text layer (PDF, DjVu) is currently unsupported.

So I guess what this bug is asking for, is that we have a custom 413 error page, like how we have custom 404 and 503 error pages.

Not only that. It is also about synchronizing the MediaWiki messages about the supported upload size (shown on Special:Upload) with the limits that the servers actually support, so that users are not misled.

Yes, but that's already being taken care of over at https://gerrit.wikimedia.org/r/#/c/248357/

Note that Special:Upload is not the preferred upload method; users are directed to it mostly when stashed uploads fail. So the above is necessary to avoid directing users from one bug to another.

Let's just say clearly that uploading >100MB files with a text layer (PDF, DjVu) is currently unsupported.

I don't think that has anything to do with Special:Upload, which is what this bug was filed about. There is that bug for chunked uploads (and, looking at the history of T94562, I can understand why some folks might be frustrated over the efforts to solve it), but the situation is not quite that clear-cut: it only affects files with OCR layers larger than about 60 kB, and it only applies to chunked and stashed uploads. Some other upload methods that are less applicable to average users (upload by URL: you need to be an admin; upload by GWToolset: you need to be a GWToolset user; server-side upload: you need to go through an annoying bureaucratic process) have neither the OCR-layer bug nor the 100 MB file size limit.

So I guess what this bug is asking for, is that we have a custom 413 error page, like how we have custom 404 and 503 error pages.

Even better would be to let MediaWiki handle this error, rather than blowing up at some caching layer. It is possible to detect this issue in PHP code (see e.g. http://andrewcurioso.com/blog/archive/2010/detecting-file-size-overflow-in-php.html; MediaWiki seems to have similar code in WebRequestUpload::isIniSizeOverflow()), but it never gets to handle this request.
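
For reference, the detection described above boils down to comparing the request's Content-Length header against PHP's configured post size limit, since the body itself has already been discarded by the time the script runs. A minimal standalone sketch of that idea, assuming a bare PHP entry point rather than MediaWiki's actual WebRequestUpload code:

```php
<?php
// Minimal sketch in the spirit of WebRequestUpload::isIniSizeOverflow();
// not MediaWiki's actual code.

// Convert php.ini shorthand sizes like "100M" or "2G" to bytes.
function iniSizeToBytes(string $value): int {
    $units = ['K' => 1024, 'M' => 1024 ** 2, 'G' => 1024 ** 3];
    return (int)$value * ($units[strtoupper(substr(trim($value), -1))] ?? 1);
}

$contentLength = (int)($_SERVER['CONTENT_LENGTH'] ?? 0);
$postMax = iniSizeToBytes((string)ini_get('post_max_size'));

if ($postMax > 0 && $contentLength > $postMax) {
    // When Content-Length exceeds post_max_size, PHP discards the request
    // body ($_POST and $_FILES arrive empty) but still runs the script, so
    // a friendly "file too large" page can be shown instead of a bare 413
    // from the frontend.
    http_response_code(413);
    echo "The file you tried to upload exceeds the maximum allowed upload size.\n";
}
```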

So I guess what this bug is asking for, is that we have a custom 413 error page, like how we have custom 404 and 503 error pages.

Even better would be to let MediaWiki handle this error, rather than blowing up at some caching layer. It is possible to detect this issue in PHP code (see e.g. http://andrewcurioso.com/blog/archive/2010/detecting-file-size-overflow-in-php.html; MediaWiki seems to have similar code in WebRequestUpload::isIniSizeOverflow()), but it never gets to handle this request.

I think it's better to discard such requests quickly. We don't really want to send an extra GB of POST data to the Apache servers unnecessarily.

I was imagining sending just the headers onward, without the POST data, although on further thought that might not be possible (would a Content-Length that doesn't match the actual content length break things?). Anyway, I know next to nothing about all this, only that this is a Wikimedia config issue and not a MediaWiki issue :)

MarkTraceur subscribed.