
Vary mobile HTML by connection speed
Closed, Declined (Public)

Description

For T119797, we need a fast way to determine the client connection speed in Varnish and either vary the response accordingly (so we serve HTML with low-res images for clients on slow connections) or set a cookie, so that client-side code can rewrite the page to use low-res variants.

We have a subscription to MaxMind's GeoIP2 Connection Speed database, so presumably the Varnish GeoIP code could be updated to utilize that.
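To make the proposal concrete, here is a minimal sketch (not the actual Varnish code, which would be VCL) of mapping MaxMind GeoIP2 connection-type values to a page variant. The "A"/"B" variant names mirror the NetSpeed=B cookie mentioned in the patch on this task, but the mapping itself is an assumption for illustration; note that, as discussed below, the Cellular classification turned out to make this database unsuitable.

```python
# Illustrative sketch: choose a page variant from a MaxMind GeoIP2
# connection-type value. "A" = full page, "B" = low-bandwidth variant
# (hypothetical names; the threshold choice is an assumption).

# Categories used by the GeoIP2 Connection-Type database.
SLOW_CONNECTION_TYPES = {"Dialup", "Cellular"}

def variant_for_connection_type(connection_type):
    """Return 'B' (low-bandwidth) for connection types likely to be slow,
    and 'A' (full page) otherwise, including unknown/None types."""
    if connection_type in SLOW_CONNECTION_TYPES:
        return "B"
    return "A"
```

Defaulting unknown types to "A" avoids degrading the page for clients we cannot classify.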


Event Timeline

ori raised the priority of this task from to High.
ori updated the task description.
ori added subscribers: ori, Peter, Aklapper and 10 others.

It may still be good enough, but just so that everyone is aware of the limitations of that database:
"The connection type is about 95% accurate in the US. Outside the US, accuracy ranges from 50% to 80%, depending on the country. The data is generally more accurate for countries with more Internet users." (from https://www.maxmind.com/en/geoip2-connection-type-database )

Yep. In a future iteration, we can look into having the client refine its netspeed indication by inferring latency and bandwidth from Resource Timing data. But the data from MaxMind is the best we can do for the initial request.
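The refinement idea above can be sketched roughly as follows: estimate effective bandwidth from Resource Timing samples (transfer size and duration) and classify the connection. This is an illustration only; the 32 kB/s (~256 kbit/s) "2G-ish" threshold and the use of the median are assumptions, not anything decided on this task.

```python
# Rough sketch of client-side connection-speed refinement from
# Resource Timing data. Threshold and aggregation are assumptions.
from statistics import median

SLOW_THRESHOLD_BYTES_PER_SEC = 32 * 1024  # ~256 kbit/s, an assumed cutoff

def is_slow_connection(samples):
    """samples: list of (transfer_bytes, duration_seconds) tuples.

    Returns True if the median observed transfer rate falls below the
    slow-connection threshold; with no usable samples, assume fast."""
    rates = [size / duration for size, duration in samples if duration > 0]
    if not rates:
        return False  # no data: prefer not to degrade the page
    return median(rates) < SLOW_THRESHOLD_BYTES_PER_SEC
```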

Scratch that; MaxMind's NetSpeed database classifies all mobile connections as "cellular", making it inadequate for this job. I'll edit the task description to propose a different approach, which is to use geolocation data instead.

I think this is great and would be a good step forward. One thing I hadn't thought about is that it would make testing more complex. For functionality testing we could just add flags to choose which variant to send when developing, but how would we do it for editors? And what kind of content will actually be sent for 2G?

Also, for real-world tests like the ones we do with WebPageTest, we would need two instances (two different IPs). We also need to think about Catchpoint. Or does that just make it too complicated? Let's talk :)

I think we'd want to allow the client to indicate a preference for a particular variant using a cookie. If there is no cookie on the incoming request, we use some heuristic in Varnish to determine which variant to serve to the user, and we make that sticky by setting a cookie. If the cookie is already set (either by a previous response from Varnish, or because the user has expressed a preference), then it is honored by Varnish.
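The stickiness logic described above can be sketched like this (the real implementation was VCL in Varnish; the cookie name NetSpeed comes from the patch on this task, while the function names and the "A"/"B" values are stand-ins for illustration):

```python
# Sketch of cookie-honoring variant selection. The boolean in the return
# value indicates whether the response should set the cookie to make the
# heuristic's choice sticky.
def choose_variant(cookies, heuristic_variant):
    """cookies: dict of request cookies; heuristic_variant: 'A' or 'B'
    as computed by some server-side heuristic (a stand-in here)."""
    existing = cookies.get("NetSpeed")
    if existing in ("A", "B"):
        return existing, False          # honor previous/user choice
    return heuristic_variant, True      # fall back and make it sticky
```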

And what kind of content will actually be sent for 2G?

No high-res srcsets, for starters!
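As a rough illustration of what dropping high-res srcsets could look like server-side, here is a minimal sketch that strips srcset attributes so slow clients fetch only the base src. A real implementation would transform the HTML with a proper parser; this regex version is illustrative only and assumes double-quoted attributes.

```python
# Illustrative only: remove srcset attributes from HTML so that clients
# on slow connections download only the base-resolution images.
import re

# Assumes double-quoted attribute values; a real implementation
# should use an HTML parser instead of a regex.
SRCSET_RE = re.compile(r'\s+srcset="[^"]*"')

def strip_srcset(html):
    return SRCSET_RE.sub("", html)
```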

Change 257496 had a related patch set uploaded (by Ori.livneh):
Split the mobile cache for clients with 'NetSpeed=B' cookie

https://gerrit.wikimedia.org/r/257496

Change 257496 merged by BBlack:
Improve handling of mobile variant cookies

https://gerrit.wikimedia.org/r/257496

This might have been discussed but ...

Why not also consider the user agent when splitting versions? If a request comes from Opera Mini, it is a pretty safe assumption to send it to the low-bandwidth version, right?

Fair point, some browsers are certainly heavily correlated to low bandwidth users. Speaking of Opera Mini specifically, though, don't they already recompress images?

Yes, but there is no point in sending those users to a heavy JS website, as JS support is very limited.

ori set Security to None.

(Unassigning myself, as the data analysis tasks have been split off into T125414 )

Should we decline this task, @ori? It doesn't feel like this will happen, since we'd prefer to give the same experience to all users.

@Jdlrobson - please forgive me if I'm missing the discussion, but what's
the rationale for providing the same experience for all? People coming on a
2G or other slow connection are already having a drastically different
experience than those of us with faster connections.

There was a talk at Google I/O that talked, in part, about this issue:
https://youtu.be/vaEV8bNi1Dw?t=7m20s

Krinkle subscribed.

T119797 was resolved by removing srcset entirely instead of using qlow, and for all mobile views (which we already fragment the HTML cache by).