New user right: "Allow large uploads"
Open, Public

Description

Author: FT2.wiki

Just as some trusted users are allowed to create large numbers of user accounts because they work in that area, there are users working in the image/media arena who feel restricted by the upload size limit when making high-quality uploads and restorations. Files of over 100 MB are routinely discussed by featured-image uploaders; larger files may also be appropriate in some cases.

Can we have a "large uploads" usergroup so that there's some mechanism to allow at least some users to upload large media files without restriction?

It would be a good solution to the problem of setting a common limit which (however large) some media editors in good standing may find restrictive.


Version: unspecified
Severity: enhancement

bzimport added a project: MediaWiki-Uploading. (Via Conduit, Nov 21 2014, 10:58 PM)
bzimport added a subscriber: wikibugs-l.
bzimport set Reference to bz21338.
bzimport created this task. (Via Legacy, Oct 29 2009, 2:38 AM)
bzimport added a comment. (Via Conduit, Oct 29 2009, 3:00 AM)

nadezhda.durova wrote:

Example: http://durova.blogspot.com/2009/10/good-news-about-booker-t-washington.html

Other examples available upon request.

Chad added a comment. (Via Conduit, Oct 29 2009, 4:34 PM)

Should be pretty trivial to add a new userright that's not subject to the size checking limits, or is subject to a higher limit.

New groups can easily be added to WMF config once the right exists.
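A minimal sketch of what Chad describes, for LocalSettings.php. The right name `upload-large` and the group name `large-uploaders` are illustrative, not an existing MediaWiki right; core would additionally need to consult the right when enforcing $wgMaxUploadSize.

```php
// Hypothetical sketch: register a new right and grant it to a dedicated group.
// 'upload-large' and 'large-uploaders' are made-up names for illustration.
$wgAvailableRights[] = 'upload-large';
$wgGroupPermissions['large-uploaders']['upload-large'] = true;

// The size check itself could then pick a higher cap for holders of the right,
// e.g. by keying off $wgMaxUploadSize, which already accepts per-type values:
$wgMaxUploadSize = [
    '*'   => 100 * 1024 * 1024, // default limit (bytes)
    'url' => 100 * 1024 * 1024,
];
```

Once the right exists in core, granting it on Wikimedia wikis is just a matter of adding the group in the WMF configuration, as Chad notes.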

tomasz added a comment. (Via Conduit, Nov 6 2011, 1:00 PM)

Does anyone have any idea if such a usergroup is going to be created in a foreseeable future?

I've been uploading many videos to Commons in the past few months, and having to ask a developer every time, taking up their time with Bugzilla requests, hasn't been very practical.

Catrope added a comment. (Via Conduit, Nov 11 2011, 1:35 PM)

This would make much more sense once chunked uploading is introduced and uploading such large files over HTTP(S) is actually technically feasible.

bzimport added a comment. (Via Conduit, Apr 15 2012, 9:51 AM)

rd232 wrote:

Question: if upload-by-url is enabled (Bug 20512 - Enable $wgAllowCopyUploads (upload by URL)), would it make sense to allow users with access to upload-by-url to upload large files by URL? Server-to-server uploads ought to be stable enough for large files.

Rillke added a comment. (Via Conduit, Apr 16 2012, 8:07 PM)

#1 upload-by-url will only work within the MW-cluster
#2 chunked upload was buggy; is it fixed now?

brion added a comment. (Via Conduit, Apr 16 2012, 8:14 PM)

Known problems with chunked upload have been fixed; we've got an opt-in option ready to go soon to test it in production.

Beware that currently, chunked uploads will only work in reasonably recentish WebKit and Mozilla-based browsers [there's a tweak in review which also makes it work on IE 10].

ArielGlenn added a comment. (Via Conduit, Apr 16 2012, 8:18 PM)

It might be nice to be able to cap the number of such uploads by a given user in some short period of time, not because we want to keep out such material, but so that we can manage capacity if this suddenly becomes a popular feature. For the initial rollout it shouldn't be a deal-breaker.

Reedy added a comment. (Via Conduit, Apr 17 2012, 9:01 AM)

(In reply to comment #8)

It might be nice to be able to cap the number of such uploads by a given user
in some short period of time, not because we want to keep out such material,
but so that we can manage capacity if this suddenly becomes a popular feature.
For the initial rollout it shouldn't be a deal-breaker.

The User::pingLimiter function should probably make this easy enough to do
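Reedy's suggestion would presumably come down to a rate-limit entry in configuration; this is a sketch assuming the upload code path calls User::pingLimiter( 'upload' ), with illustrative numbers.

```php
// Hypothetical sketch: cap upload frequency via the existing rate-limit
// machinery, assuming the upload path pings the 'upload' limiter.
$wgRateLimits['upload'] = [
    // at most 10 uploads per hour per logged-in user (numbers are illustrative)
    'user' => [ 10, 3600 ],
];
```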

Rillke added a comment. (Via Conduit, Apr 17 2012, 5:55 PM)

(In reply to comment #7)

Beware that currently, chunked uploads will only work in reasonably recentish WebKit and Mozilla-based browsers

I am using the API. Does it mean I can't compose the multi-part message myself? Do I have to set the User-Agent to mozilla when using my personal implementation? I don't think so -- it's just for users using UpWiz, right?

BTW: a bit more documentation on how to use that feature would be nice (e.g. which parameters are required or recommended initially, with each chunk, and at the end, and which responses to expect) -- not only for me (I finally got it working) but for everyone who wants to improve their upload bots.

ToAruShiroiNeko added a comment. (Via Conduit, Apr 17 2012, 8:09 PM)

I would suggest against a cap, at least for bots. There can be cases where a bot has a large, freely licensed video archive to copy to Commons.

For instance LOC probably has a fair number of war-era videos that could be processed and uploaded in bulk.

Just my 2 cents.

brion added a comment. (Via Conduit, Apr 17 2012, 8:13 PM)

(In reply to comment #10)

(In reply to comment #7)
> Beware that currently, chunked uploads will only work in reasonably recentish WebKit and Mozilla-based browsers
I am using the API. Does it mean I can't compose the multi-part message myself?

If you're an API user, then you can construct your requests however you wish -- chunked upload capabilities simply exist in the API and you can use them at any time.

The limitation is in the UploadWizard front-end, which needs to know it has support for a version of Blob.slice() that acts as expected. (Ideally it would detect this by trying it, rather than checking browser engine versions.)

BTW: A bit more documentation on how to use that feature (e.g. what parameters
are required or recommended initially, with each chunk and finally and what
possible responses, I have to expect) would be nice; not only for me - finally
I got it working - but for all who want to improve their upload bots.

Documentation would be nice, yes. :) I'll see if I can find or improve some...
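For API users, the flow brion describes can be sketched as follows. The chunk-splitting helper below is runnable; the commented outline of the action=upload conversation summarizes the protocol as documented on mediawiki.org (API:Upload), and the parameter values shown are illustrative.

```python
import io

CHUNK_SIZE = 1 * 1024 * 1024  # 1 MiB per chunk (illustrative choice)

def iter_chunks(stream, chunk_size=CHUNK_SIZE):
    """Yield (offset, data) pairs suitable for the `offset` and `chunk`
    parameters of successive action=upload requests."""
    offset = 0
    while True:
        data = stream.read(chunk_size)
        if not data:
            break
        yield offset, data
        offset += len(data)

# Rough shape of the API conversation (see API:Upload on mediawiki.org):
#   first chunk : action=upload, stash=1, filename=..., filesize=N,
#                 offset=0, chunk=<data>      -> result=Continue, filekey=...
#   next chunks : same, plus filekey=..., offset=<current offset>
#   last chunk  : response has result=Success and a final filekey
#   publish     : action=upload with filekey=..., filename=..., comment=...
```

As brion says, API clients can construct these multipart requests however they wish; the Blob.slice() limitation only affects the UploadWizard front-end.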

ToAruShiroiNeko added a comment. (Via Conduit, May 8 2012, 10:57 AM)

So there is no upload limit if one uses the API?

Catrope added a comment. (Via Conduit, May 8 2012, 6:47 PM)

(In reply to comment #14)

So there is no upload limit if one uses the API?

No, the same limits apply, but the API allows chunked uploading (uploading the file in smaller chunks that are recombined server-side). There is now an experimental preference you can enable that makes UploadWizard use chunked uploading as well. There is no support for chunked uploading in Special:Upload.

For this bug, we would add a new user right that allows users to upload large (define large? up to 1 GB?) files. Chunked upload is the only way to upload files larger than 100 MB; that's why the two are linked. Now that chunked upload seems to be working and deployed (experimentally, but still), this is something we could work on.

ToAruShiroiNeko added a comment. (Via Conduit, May 9 2012, 12:54 AM)

I'd say 4.5GB (the size of a DVD) as I am trying to put in a 1.4GB file (video) already.

Also a progress bar would be more than helpful.

Fastily added a comment. (Via Conduit, May 14 2012, 7:10 AM)

I'd support a 4.5 GB limit. That's more than sufficient for the videos I'm trying to upload. Also, anyone interested in this bug should take notice of https://bugzilla.wikimedia.org/show_bug.cgi?id=36829: chunked uploads with UploadWizard choke on files >300 MB.

Dereckson added a comment. (Via Conduit, Sep 7 2012, 7:03 PM)

By the way, users can now upload files of up to 500 MB by default, instead of the 100 MB limit in place when this bug was opened.

Fastily added a comment. (Via Conduit, Jun 8 2013, 10:02 PM)

• Bug 36687 has been marked as a duplicate of this bug.
555 added a comment. (Via Conduit, Feb 20 2014, 11:49 PM)

Adding bug 35925 (tracking); it isn't hard to find book digitizations larger than the current upload limit.

Fastily added a comment. (Via Conduit, Feb 20 2014, 11:56 PM)

(In reply to 555 from comment #20)

Adding bug 35925 (tracking); it isn't hard to find book digitizations larger than the current upload limit.

Not to mention video, where only a few minutes of 1080p HD quickly exceeds 1GB

Gilles added a project: Multimedia. (Via Web, Nov 24 2014, 3:42 PM)
Chad removed a subscriber: Chad. (Via Web, Dec 16 2014, 6:10 PM)
