
Experiment with serving lower resolution images as the default for all page views
Closed, Declined · Public

Description

Currently the mobile site serves images without srcset/retina support.
Question: Do we gain anything by serving lower resolution images using qlow JPEGs?

There are four options:

  1. Serve qlow JPEGs for all and allow opt-out.
  2. Serve qlow JPEGs for everyone and, via JS, enhance the images on scroll.
  3. Do nothing (maintain the status quo).
  4. Maintain the status quo but allow users to opt into qlow image data-saving enhancements.

Option 2 in detail:

  • With JavaScript disabled, the reader will only see the low-res image. The image may be pixelated, but it can still convey information. Suspend judgement on whether this is ethical until the experiment has run its course.
  • With JavaScript enabled, when an image is nearly scrolled into view (within two full viewport heights; see the sketch after this list):
    • if there is no srcset support, we will replace the low-res src attribute with the value of data-src-improved
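A minimal sketch of this enhancement, assuming images carry a low-res src plus a data-src-improved attribute holding the full-resolution URL (per the bullets above); the scroll handling and threshold here are illustrative, not the actual MobileFrontend code:

```
// Upgrade low-res images once they come within two viewport heights.
// (A fuller version would also branch on srcset support, per the bullet above.)
function upgradeNearbyImages(): void {
  const threshold = window.innerHeight * 3; // viewport + two extra heights
  document
    .querySelectorAll<HTMLImageElement>('img[data-src-improved]')
    .forEach((img) => {
      if (img.getBoundingClientRect().top < threshold) {
        img.src = img.dataset.srcImproved!;         // dataset maps data-src-improved
        img.removeAttribute('data-src-improved');   // upgrade only once
      }
    });
}

window.addEventListener('scroll', upgradeNearbyImages, { passive: true });
upgradeNearbyImages(); // handle images already near the viewport on load
```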

Event Timeline

Jdlrobson raised the priority of this task to Medium.
Jdlrobson updated the task description.
Jdlrobson added a project: Web-Team-Backlog.
Jdlrobson subscribed.
Jdlrobson renamed this task from "Experiment with serving lower resolution images as the default" to "Experiment with serving lower resolution images as the default for all page views". (Dec 8 2015, 9:22 PM)
Jdlrobson set Security to None.

So if I understand correctly, the test plan is to

  • change the images in the HTML to low-res for everyone
  • use JavaScript to magic it back to normal for 95% of the users
  • if that test goes well, maybe figure out how the 95% (i.e. users with NetSpeed A) can get normal images by default

I still think that's a fundamentally broken strategy.

So if I understand correctly, the test plan is to

  • change the images in the HTML to low-res for everyone
  • use JavaScript to magic it back to normal for 95% of the users

Nope, the test would be to send 5% of users low-res images, with JavaScript to magic them back.
Another 5% of users would get normal images with the srcset attribute removed (working on the assumption that T119797 shows positive results).

We'd then compare the impact of these two buckets across the whole site.
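For concreteness, a toy sketch of the 5% / 5% / 90% split described above; the bucket names and the use of Math.random are assumptions for illustration, not the actual implementation:

```
// Toy illustration of the proposed bucketing: 5% get low-res images with a
// JS upgrade, 5% get normal images with srcset stripped, 90% are control.
// Bucket names are made up for this sketch.
type Bucket = 'low-res-js' | 'no-srcset' | 'control';

function assignBucket(): Bucket {
  const r = Math.random();
  if (r < 0.05) return 'low-res-js'; // low-res images, JS magics them back
  if (r < 0.10) return 'no-srcset';  // normal images, srcset attribute removed
  return 'control';                  // status quo
}
```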

  • if that test goes well, maybe figure out how the 95% (i.e. users with NetSpeed A) can get normal images by default

The data above would inform us of the benefits of such an approach. There would be several options:

  • Serve low-res thumbnails to all users and enhance.
  • Serve low-res thumbnails only to users on a slow connection and enhance.
  • Keep the status quo.

I still think that's a fundamentally broken strategy.

Suspend disbelief until we have some data :)
If it's good enough for a global news site (e.g. the BBC), I'm confident this can be justified, but right now I'm just keen to get that data.

Umm, what about interactions with the parser/Varnish cache? I'm not sure it will be only 5% who see this.

Suspend disbelief until we have some data :)

A fundamentally broken testing strategy, I meant. But apparently Ori implemented the cache splitting in the meantime:

I am a bit confused about what this task is about, as its title says "default for all page views", but taken together with those patches the experiment plan seems reasonable. You'd have to make sure to run it long enough to overcome the cold-cache effect for the low-res/no-srcset variants.

Jdlrobson lowered the priority of this task from Medium to Low. (Dec 9 2015, 1:13 AM)
Jdlrobson added a subscriber: ori.

Essentially, the hypothesis I have is "Is serving low-resolution images by default for all users on a first visit* a suitable default?", and I want an answer to that with the test above. Ordinary resolutions would be loaded on scroll.

However, talking to @ori, it seems like our experimentation might be better aimed at serving the lead image and deferring loading of the rest of the content, especially since that approach would give us this for free.

  • e.g. assuming NetSpeed = B

Is the qlow JPEG setting what you meant by 1/3 size? I got the impression earlier that you were talking about using 1/3-resolution images (i.e. when the parser wants a 300px image, return a 100px one and scale it up via CSS). The quality reduction from high JPEG compression seems pretty small.
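For contrast, the "1/3 resolution" idea being asked about would look roughly like the sketch below; the URL and dimensions are made up for illustration:

```
// Hypothetical "1/3 resolution" variant: the parser wants a 300px-wide
// thumb, but we emit a 100px one and let the browser scale it up via the
// width/height attributes. The URL is invented for this example.
const img = document.createElement('img');
img.src = '/w/thumb/Example.jpg/100px-Example.jpg'; // 1/3 of requested width
img.width = 300;  // rendered at the originally requested size
img.height = 200; // illustrative; keeps the original aspect ratio
document.body.appendChild(img);
```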

The qlow JPEG setting would be a good start, yes, given Tim's promising results in https://phabricator.wikimedia.org/T119797#1867024

I don't see why everyone would not want to benefit from this (with the chance to opt out).

Umm, what about interactions with the parser/Varnish cache? I'm not sure it will be only 5% who see this.

The cache is varied on the "NetSpeed" cookie.
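To sketch what "varied on the cookie" means for the cache: requests carrying different NetSpeed values must map to different cache objects, so low-res HTML is never served to the wrong bucket. The real split lives in Varnish VCL; this TypeScript is purely illustrative:

```
// Illustration only: the cache key incorporates the NetSpeed cookie value,
// so each bucket gets its own cached copy of the page.
function cacheKey(url: string, cookies: Map<string, string>): string {
  const netSpeed = cookies.get('NetSpeed') ?? 'none';
  return `${url}|netspeed=${netSpeed}`;
}
```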

An opt-out does not prevent people from being outraged over a change they don't like. I had to learn that the hard way :-)

And of course additional preferences always mean a mental burden, borne disproportionately by those who don't use the site a lot. Most readers probably wouldn't even realize it's optional (how often do you open the preferences menu of, say, the New York Times? You probably never have, although you probably read articles there every once in a while).

@bmansurov played around with this in https://gerrit.wikimedia.org/r/#/c/248312

@Tgr, history shows little evidence of outrage over changes to the mobile site, and I think such a change will be far less noticeable given that most images are below the fold. Given the prevalence of 2G connections, I'm confident we can build a strong case for why this is being done.

I believe we should not apply qlow to JavaScript-capable UAs.

I do, however, recommend we apply qlow or chroma subsampling by default to <noscript> UAs, and, as is the convention in Wikipedia Zero, only in the context of an article (i.e., not File: pages). When the user taps through from the article, they can get to the more bandwidth-intensive images on File: pages and their sublinks.

This is expressed in T124390: [GOAL] Load images with care as follows:

For UAs without <script> support, embed the simplest possible <noscript><img> tag (alt, src, width, height) per image and use the qlow / chroma sub-sampling type approaches.
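A sketch of that minimal fallback markup, built as a string for illustration; the helper name and thumbnail URL are hypothetical:

```
// Minimal <noscript> fallback carrying only the attributes the task names
// (alt, src, width, height); src would point at a qlow / chroma-subsampled
// thumb. Helper and URL are made up for this sketch.
function noscriptImg(src: string, alt: string, w: number, h: number): string {
  return `<noscript><img alt="${alt}" src="${src}" width="${w}" height="${h}"></noscript>`;
}

// e.g. noscriptImg('/w/thumb/qlow-Example.jpg', 'Example', 320, 240)
```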

I do think we should also consider chroma sub-sampling, but not qlow, for JavaScript-capable UAs. This is expressed in T124390: [GOAL] Load images with care as follows:

For UAs with JavaScript executed in the client, attach the logic described earlier for lazy loading images. Consider chroma sub-sampling, but not qlow.

However, I think tasks like T126793: Lazy loading images breaks ResourceLoader-blacklisted JavaScript clients come ahead of these optimizations, as we first need to get things right for ResourceLoader-impaired yet JavaScript-capable UAs.

This conversation is happening in too many places. Given that we have no specific plans for experiments anymore, can we merge this into T119797: Serve low-res images by default to users on slow or metered mobile connections (and possibly drop the "slow or metered" part if using the NetSpeed cookie is not considered anymore)?

This conversation is happening in too many places. Given that we have no specific plans for experiments anymore, can we merge this into T119797: Serve low-res images by default to users on slow or metered mobile connections (and possibly drop the "slow or metered" part if using the NetSpeed cookie is not considered anymore)?

Agreed. @Jdlrobson, okay to abandon this task (T120875)? It doesn't seem to block T124390: [GOAL] Load images with care, although I was curious whether you want to keep it open for a rainy day.

Declined in favour of the existing lazy-loading images implementation.