
Use optimised version instead of original when original size is used as thumbnail
Closed, Duplicate · Public

Description

In srcset it often happens that the original image is referenced when it is smaller than 2x the thumbnail size.

But the originally uploaded image is unnecessarily big in file size: it has not been passed through ImageMagick, so it still contains useless metadata and may use an unnecessarily high quality setting. I use a JPEG quality of 69%. Above all, the originals are delivered far too big.

That's even worse because srcset targets mobile devices, where bandwidth is limited.

Proposed solution: pass the originals through the thumbnailing process and use those versions in content display.

The real originals should only be linked (not embedded) on the image page.
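
A minimal sketch of the behaviour being described, assuming a recent MediaWiki and run in a MediaWiki context (e.g. maintenance/eval.php); 'Example.jpg' is a placeholder file name, and older versions would use wfFindFile() instead of the RepoGroup service:

```php
// Hedged sketch only; Example.jpg is a placeholder.
use MediaWiki\MediaWikiServices;

$file = MediaWikiServices::getInstance()->getRepoGroup()->findFile( 'Example.jpg' );
if ( $file ) {
	// Ask for a thumbnail at the full original width.
	$thumb = $file->transform( [ 'width' => $file->getWidth() ] );
	if ( $thumb ) {
		// With the behaviour described above, this URL points at the original
		// upload (all metadata, original quality), not a re-encoded thumbnail.
		echo $thumb->getUrl() . "\n";
	}
}
```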

Event Timeline

Subfader raised the priority of this task to Needs Triage.
Subfader updated the task description.
Subfader subscribed.
Subfader set Security to None.
Subfader updated the task description.
Subfader updated the task description.

What is a link to a specific example to see this problem?


Bawolff renamed this task from Compress original images for display in content view (srcset) to Do not return original image assets, but always run through image magick (to strip metadata, etc). (Jun 11 2015, 6:32 PM)
Bawolff triaged this task as Low priority.
Bawolff edited projects, added Multimedia; removed MediaWiki-ContentHandler.

I'm unsure if we should do this. /me leans towards no, but is unsure

Maybe it should be a config option.


Technical version of this bug: subfader is essentially asking that MediaHandler::mustRender always return true (i.e. never serve the original asset directly).
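
A rough sketch of what that change could look like, assuming a hypothetical handler class (not code from this task) that subclasses the core JpegHandler and forces rendering:

```php
// Hypothetical sketch: force every JPEG through the scaler, so even
// "original size" requests produce a re-encoded thumbnail.
class AlwaysRenderJpegHandler extends JpegHandler {
	/**
	 * @param File $file
	 * @return bool
	 */
	public function mustRender( $file ) {
		return true; // never serve the original asset directly
	}
}

// LocalSettings.php (hypothetical): replace the default JPEG handler.
$wgMediaHandlers['image/jpeg'] = AlwaysRenderJpegHandler::class;
```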

Subfader renamed this task from Do not return original image assets, but always run through image magick (to strip metadata, etc) to Do not return original image assets, but always run through imagemagick (to strip metadata, etc). (Jul 18 2015, 9:49 AM)

I'm not sure if you all are aware of how bad this is for performance.

MediaWiki is not Wikipedia.

> I'm not sure if you all are aware of how bad this is for performance.

Well, enlighten me then.

- Given average images, what percentage overhead are we talking about for a normal MediaWiki install? (I don't think the decision should be based on the lower JPEG quality you are using.)
- How often does ImageMagick make an image larger than it was originally (when controlling for size)?
- How much does an unnecessary resize degrade quality? (Probably not much, probably not even noticeable, but it should be considered.)
- How often do people upload images at precisely the size they wanted, with the intention that that exact version is used? (Probably not a lot, but it might not be zero.)
- How often do people intentionally include metadata that they want kept in the image, and are actually upset when resizing strips it?
- How often is there some sort of problem with resizing images, where forcing an image through ImageMagick takes something that partially works and makes it not work at all? (This happens sometimes on Wikipedia, especially with large images and especially with GIFs, and it tends to happen significantly more often than you might expect.)

That said, I really don't know. Maybe it does make sense to change the behaviour. The risks and benefits are both rather unclear to me, which makes me think it's best to keep it as is.

> MediaWiki is not Wikipedia.

The performance analysis is the same, as far as I can tell, whether you are talking about Wikipedia or an average third-party install. (From a client-side perspective, that is. From a server-side perspective in a bandwidth-limited environment it might be different, but to what extent is unclear.)

If you are super concerned about image bandwidth, to the point where you are significantly lowering the quality settings, I imagine you probably don't really want high-DPI/srcset support either?

Wikipedia's small thumbs are not the standard use case for wiki thumbs.

I think it's simpler than that:

- It can happen that original images are smaller than the requested thumbnail width in px.
- If an image is not that large, users specify no width parameter at all, or the original width.
- Users can also have a higher default thumbnail width configured (see the config sketch after this list).
- With srcset 2x it is even more likely that the original is returned.
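
For context on the default width point, here is a minimal LocalSettings.php sketch using the real $wgThumbLimits and $wgDefaultUserOptions settings; the values are only illustrative:

```php
// Illustrative values only; these settings exist in core MediaWiki.
// Widths selectable via the "thumbsize" user preference:
$wgThumbLimits = [ 120, 150, 180, 200, 250, 300, 400 ];
// Default index into $wgThumbLimits (here: 400px), so [[File:...|thumb]]
// without an explicit width is requested at a fairly large size,
// and the 2x srcset candidate (800px) quickly exceeds many originals.
$wgDefaultUserOptions['thumbsize'] = 6;
```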

To sum it up:
Could MediaWiki prevent returning unnecessarily large images?
Why doesn't it?

Subfader renamed this task from Do not return original image assets, but always run through imagemagick (to strip metadata, etc) to Do not return original image assets, but always run through imagemagick (performance boost). (Jul 18 2015, 10:25 AM)
Krinkle renamed this task from Do not return original image assets, but always run through imagemagick (performance boost) to Use optimised version instead of original when original size is used as thumbnail. (Sep 4 2015, 4:37 PM)
Krinkle raised the priority of this task from Low to Medium.

Rephrased. This is a genuine bug.

When MediaWiki requests a thumbnail (either through the wikitext parser or when thumbs are requested via the API), it should always get an actual thumbnail back.

Even if the requested thumbnail size is the same as or larger than the original, we should serve a thumbnail-style version at the original size, not the original itself.

In addition to the performance gains, this also makes the behaviour more predictable and consistent. It also solves various caching-related rendering bugs: the original image can change (via re-upload), at which point consumers are stuck with a URL whose content may change to a wildly different size (e.g. when a 5000px image is re-uploaded over a 300px low-res one, which happens quite often on Commons when users upload low-res versions first and/or when new originals become available from content providers). Thumbnail URLs are supposed to encode their size in the URL.
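
For illustration, assuming the default hashed upload directory layout and a placeholder file name, the difference between the two kinds of URL looks like this:

```
https://example.org/w/images/a/a9/Example.jpg                            (original, no size in URL)
https://example.org/w/images/thumb/a/a9/Example.jpg/800px-Example.jpg    (thumbnail, size encoded in URL)
```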

If this were fixed, maintenance tasks like "optimize the thumb directory" would take full effect: always display optimized thumbs and never touch the original upload files.
https://phabricator.wikimedia.org/T111633

Possible dupe of T67383?

Beyond file size, this is also useful because thumbnails get EXIF rotation applied.

This is even more important if you additionally create WebP thumbnails (up to 30-50% smaller files).

Simply returning the original is lame.

And originals are returned a lot with extensions like MediaViewer...

#perfmatters