
Vary mobile HTML by connection speed
Closed, Declined · Public

Description

For T119797, we need a fast way to determine the client connection speed in Varnish and either vary the response accordingly (so we serve HTML with low-res images for clients on slow connections) or set a cookie, so that client-side code can rewrite the page to use low-res variants.

We have a subscription to MaxMind's GeoIP2 Connection Speed database, so presumably the Varnish GeoIP code could be updated to use it.
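As a rough illustration of the idea (not code from the task), the lookup result could be folded into a variant decision along these lines. The connection-type strings follow MaxMind's documented categories; the variant names ("low"/"high") are placeholders invented for this sketch. Note that, as a later comment points out, "Cellular" covers both fast and slow mobile connections, which is part of why this approach was ultimately abandoned.

```python
# Hedged sketch: map a MaxMind GeoIP2 connection-type string to a page
# variant. Treating "Cellular" as slow is exactly the over-broad
# classification discussed later in this task.
SLOW_CONNECTION_TYPES = {"Dialup", "Cellular"}

def variant_for_connection(conn_type):
    """Return the HTML variant to serve for a given connection type."""
    if conn_type in SLOW_CONNECTION_TYPES:
        return "low"   # serve HTML with low-res images
    return "high"      # serve the full-resolution variant
```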


Event Timeline

ori created this task.Nov 28 2015, 7:33 PM
ori raised the priority of this task from to High.
ori updated the task description. (Show Details)
ori added subscribers: ori, Peter, Aklapper and 10 others.

It may still be good enough, but just so that everyone is aware of the limitations of that database:
"The connection type is about 95% accurate in the US. Outside the US, accuracy ranges from 50% to 80%, depending on the country. The data is generally more accurate for countries with more Internet users." (from https://www.maxmind.com/en/geoip2-connection-type-database )

ori added a comment.Nov 28 2015, 8:23 PM

It may still be good enough, but just so that everyone is aware of the limitations of that database:
"The connection type is about 95% accurate in the US. Outside the US, accuracy ranges from 50% to 80%, depending on the country. The data is generally more accurate for countries with more Internet users." (from https://www.maxmind.com/en/geoip2-connection-type-database )

Yep. In a future iteration, we can look into having the client refine its netspeed indication by inferring latency and bandwidth from Resource Timing data. But the data from MaxMind is the best we can do for the initial request.

ori added a comment.Nov 30 2015, 10:43 AM

Scratch that; MaxMind's NetSpeed database classifies all mobile connections as "cellular", making it inadequate for this job. I'll edit the task description to propose a different approach, which is to use geolocation data instead.

I think this is great and would be a good step forward. One thing I hadn't thought about is that it will make testing more complex. For functional testing we could just add flags to choose which variant to send when developing, but how would we do it for editors? And what kind of content will actually be sent for 2G?

Also, for real-world tests like the ones we do with WebPageTest, we would need two instances (two different IPs). We also need to think about Catchpoint. Or does this just make it too complicated? Let's talk :)

ori added a comment.Nov 30 2015, 10:51 AM

I think we'd want to allow the client to indicate a preference for a particular variant using a cookie. If there is no cookie on the incoming request, we use some heuristic in Varnish to determine which variant to serve, and we make that choice sticky by setting a cookie. If the cookie is already set (either by a previous response from Varnish, or because the user has expressed a preference), then Varnish honors it.
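A minimal sketch of that flow, assuming the `NetSpeed` cookie name that appears in the related Gerrit change; the heuristic itself (GeoIP or otherwise) is left abstract:

```python
def choose_variant(cookies, heuristic):
    """Pick the variant to serve and, if needed, a cookie to set.

    `cookies` is a dict of request cookies; `heuristic` is any callable
    returning a variant name. Returns a (variant, set_cookie) pair;
    set_cookie is None when the incoming cookie is simply honored.
    """
    if "NetSpeed" in cookies:
        # Honor a previous Varnish decision or an explicit user preference.
        return cookies["NetSpeed"], None
    variant = heuristic()
    # Make the heuristic's answer sticky for subsequent requests.
    return variant, ("NetSpeed", variant)
```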

ori added a comment.Nov 30 2015, 11:23 AM

And what kind of content will actually be sent for 2G?

No high-res srcsets, for starters!
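For illustration only, dropping high-res variants can be as simple as stripping `srcset` attributes from the served HTML. This is a naive regex sketch, not the MediaWiki implementation, and it assumes double-quoted attribute values; real HTML should go through a proper parser.

```python
import re

# Naive sketch: remove srcset attributes so only the base src remains.
SRCSET_RE = re.compile(r'\s+srcset="[^"]*"')

def strip_srcset(html):
    return SRCSET_RE.sub("", html)
```

For example, `strip_srcset('<img src="a.jpg" srcset="a2.jpg 2x">')` yields `'<img src="a.jpg">'`.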

Change 257496 had a related patch set uploaded (by Ori.livneh):
Split the mobile cache for clients with 'NetSpeed=B' cookie

https://gerrit.wikimedia.org/r/257496

Change 257496 merged by BBlack:
Improve handling of mobile variant cookies

https://gerrit.wikimedia.org/r/257496

Gilles assigned this task to ori.Dec 14 2015, 8:01 PM
Gilles moved this task from Inbox to Doing on the Performance-Team board.
Gilles added a subscriber: Gilles.
Nuria added a subscriber: Nuria.Dec 15 2015, 5:58 PM

This might have been discussed but ...

Why not also consider the user agent when splitting versions? If a request comes from Opera Mini, it is a pretty safe assumption to send them to the low-bandwidth version, right?

Fair point; some browsers are certainly heavily correlated with low-bandwidth users. Speaking of Opera Mini specifically, though, don't they already recompress images?

Nuria added a comment.Dec 15 2015, 8:33 PM

Fair point; some browsers are certainly heavily correlated with low-bandwidth users. Speaking of Opera Mini specifically, though, don't they already recompress images?

Yes, but there is no point in sending those users a heavy JS website, as JS support is very limited.

ori added a comment.Jan 11 2016, 6:34 PM

@Tbayer, got a status update?

ori reassigned this task from ori to Tbayer.Jan 11 2016, 6:34 PM
ori set Security to None.
Tbayer removed Tbayer as the assignee of this task.Feb 14 2016, 5:35 AM

(Unassigning myself, as the data analysis tasks have been split off into T125414 )

Restricted Application added a project: Operations. · View Herald TranscriptMay 4 2016, 9:13 AM

Should we decline this task, @ori? It doesn't feel like this will happen, since we'd prefer to give the same experience to everyone.

@Jdlrobson - please forgive me if I'm missing the discussion, but what's the rationale for providing the same experience for all? People coming in on a 2G or other slow connection are already having a drastically different experience than those of us with faster connections.

There was a talk at Google I/O that touched, in part, on this issue:
https://youtu.be/vaEV8bNi1Dw?t=7m20s

Restricted Application added a project: Operations. · View Herald TranscriptAug 11 2016, 2:11 PM
Krinkle closed this task as Declined.Dec 6 2016, 1:11 AM
Krinkle added a subscriber: Krinkle.

T119797 was resolved by removing srcset (rather than using qlow), and for all mobile views, which we already fragment the HTML caches by.