Description
The current limit for uploads is 1 GB. Nowadays, when full HD videos are widely available, this limit is very low and effectively restricts full HD video uploads to a few minutes of footage. Please raise this limit altogether, or at least for files of type ogv or webm.
Details
Subject | Repo | Branch | Lines +/-
---|---|---|---
Raise file upload limit to 2047MB | operations/mediawiki-config | master | +1 -1
Event Timeline
Are there technical reasons for the current limitation (described on https://commons.wikimedia.org/wiki/Commons:Maximum_file_size )?
If so, have they been sorted out?
Where would the "new" max limit be?
I don't know; I haven't even begun to look at the technical implications, and I'm inclined not to raise the limit blindly. The upload Varnishes are already a special case now, and I doubt they're the right architecture for streaming or uploading enormous files like these.
Talked this over a bit with Mark; I think we could do this if we set Varnish to hit_for_pass in vcl_fetch for files over a certain size (probably somewhere in the 100 MB to 1 GB range we're currently caching for large files), so that the largest ones aren't cached at the Varnish layer at all. This needs some VCL changes to support; a rough sketch is below.
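Purely as an illustration (this is not the production VCL; Varnish 3 syntax and the std vmod are assumed, and the 256 MB threshold is just a placeholder, not a decided value), the change could look roughly like this:

import std;

sub vcl_fetch {
    # If the backend response body is larger than ~256 MB, don't cache the object;
    # record the decision as hit_for_pass so later requests bypass the cache entirely.
    if (std.integer(beresp.http.Content-Length, 0) > 268435456) {
        set beresp.ttl = 600s;
        return (hit_for_pass);
    }
}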
I like the idea of refining this by media type. For example, 100 MB SVG files are a mad idea and we should probably reject them, but we have perfectly valid 250 MB JPEG files, and keeping a 1 GB limit on image types is still practical.
At the same time, being able to upload, say, a 5 GB video file could make sense if we are going to accept higher quality videos than the current effective limit of sub-HD resolutions. With institutions becoming interested in donating professional-quality video, and recent examples of 1 GB+ files being released into the public domain by the White House, making it easier to upload larger video files could encourage more video-related projects.
For all types there are exceptions, but handling these with "ask an admin (or another user with the relevant rights) to do it for you" is a lot better than having to raise a Phabricator request just to upload a file.
Remember: if you increase this limit, you also need to increase the url-downloader.wikimedia.org proxy limit.
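Purely for illustration, assuming that proxy is Squid (the actual service and its configuration may differ), the relevant setting would be something along these lines:

# Hypothetical url-downloader proxy configuration; the directive and value
# are assumptions for illustration, not the real config.
reply_body_max_size 2047 MB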
I'm doubtful the video scalers would be able to convert multi-GB HD video files within the (1 hour?) timeout they currently have set. Of course, it's better to have the videos without derivative transcodes than not to have the videos at all.
The implications do depend somewhat on file type. Most multi-GB image files will probably fail to render; just the time to get the file from Swift to an image scaler is problematic. (Videos don't have that problem because they aren't rendered on demand at view time.)
I'm sure opinions on the role of Commons vs e.g. archive.org and what we should be hosting to what ends differ within the community and the organization, but at a technical level there's a bottom line here: we don't currently have the infrastructure or manpower to scale up to, say, accepting and serving 10x or 100x the video we do today. Doing so would require an organizational initiative around it with staffing and budget. Massive video traffic is not a cheap or easy thing to handle well.
We're doing OK with what we have today (although it does strain us at some layers), but we're not ready to flip a switch where upload and download video traffic undergoes a massive increase (due not just to the length of the files, but to the increased perceived utility of our video storage/playback for millions of users). Allowing the automated inbound flow of much larger video content could end up being that switch in practice.
I'm not rejecting the idea outright, I just think these kinds of thoughts need to be part of the discussion.
We have a lot of things making video support bad. I suspect there are many things blocking Commons from being a video paradise.
For reference, here is the current distribution of video sizes:
MariaDB [commonswiki_p]> select count(*), power( 2, floor( log2( img_size ) ) ) from image where img_media_type = 'VIDEO' group by floor(log2(img_size)) order by floor(log2(img_size)) desc limit 50;
count(*) | power( 2, floor( log2( img_size ) ) )
---|---
20 | 2147483648
56 | 1073741824
206 | 536870912
370 | 268435456
830 | 134217728
4449 | 67108864
5969 | 33554432
7791 | 16777216
8460 | 8388608
7535 | 4194304
6326 | 2097152
5024 | 1048576
3958 | 524288
2910 | 262144
2091 | 131072
1054 | 65536
583 | 32768
275 | 16384
109 | 8192
49 | 4096
3 | 2048
So out of 58,068 videos, only 1,482 (about 2.6%) are larger than 128 MB, and only about 10% are larger than 64 MB. As it stands, only a small percentage of videos even approach the upper reaches of what we allow, so I don't think further increasing the limit will open a sudden floodgate.
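For clarity, those percentages come from summing the top buckets of the table above (the second column is the lower bound of each power-of-two size bucket, in bytes):

20 + 56 + 206 + 370 + 830 = 1482 files of 128 MiB or more, and 1482 / 58068 ≈ 2.6%
1482 + 4449 = 5931 files of 64 MiB or more, and 5931 / 58068 ≈ 10.2%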
Change 266544 had a related patch set uploaded (by Krinkle):
Raise file upload limit to 2047MB
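For context, a minimal sketch of what such a one-line change in operations/mediawiki-config might look like, assuming it adjusts MediaWiki's $wgMaxUploadSize (expressed in bytes); the value and comment here are illustrative, not the actual patch:

// Hypothetical one-line change, not the actual patch: raise the maximum
// upload size from 1 GiB to 2047 MiB (just under the 2 GiB boundary of a
// signed 32-bit integer).
$wgMaxUploadSize = 1024 * 1024 * 2047; // was: 1024 * 1024 * 1024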
https://commons.wikimedia.org/wiki/Help:Server-side_upload mentions that UW does up to 1 GiB; is that now also raised by this change?
@Dzahn First test in the video2commons tool failed: T128358: Uploading 1.2GB ogv results in 503