The way it would work:
- mitmdump records the page load from the live website in the browser; it is probably simplest to have browsertime drive the browser for us
- mitmproxy2mahimahi converts the mitmproxy capture into mahimahi's recorded-site format
- network connectivity is blocked? (overkill in theory, based on our tests so far)
- mm-webreplay from mahimahi-h2o replays the recorded site without connectivity, under specific simulated network conditions, again with browsertime driving the browser
- a custom script processes the stats output by browsertime and pushes them to Graphite or similar
- connectivity is restored?
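A minimal sketch of how the replay step might be invoked, assuming mahimahi's stock mm-delay shell and the mm-webreplay binary from the mahimahi-h2o fork; the directory, URL, and iteration count are placeholders, and the exact flag layout of mm-webreplay is an assumption:

```python
# Sketch: build the command line for an offline replay under simulated
# network conditions. mm-delay is stock mahimahi; mm-webreplay here is
# assumed to come from the mahimahi-h2o fork.
from typing import List

def replay_command(site_dir: str, url: str, rtt_ms: int = 40) -> List[str]:
    """Wrap a browsertime run in mm-delay + mm-webreplay.

    mahimahi shells compose by nesting: the outer shell adds latency,
    the inner one serves the recorded site, and the innermost command
    (browsertime) loads the page against the replayed copy.
    """
    return [
        "mm-delay", str(rtt_ms // 2),   # mm-delay takes a one-way delay in ms
        "mm-webreplay", site_dir,       # serve the recorded site, no network
        "browsertime", "-n", "5", url,  # several iterations for stable stats
    ]

cmd = replay_command("/tmp/site", "https://example.com", rtt_ms=40)
print(" ".join(cmd))
```

In a real run this list would go to `subprocess.run(cmd)`; keeping the command construction pure makes it easy to test the simulated-network parameters without the tools installed.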
If the standard deviation stays as low as in our early tests, this could replace WPT as the reference for synthetic testing of regressions/alerts.
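The custom stats-processing step could start out as something like the following, which flattens a nested stats dict into Graphite's plaintext protocol (`<path> <value> <timestamp>` per line). The metric names and nesting are illustrative, not browsertime's actual JSON schema, and `graphite.local` is a placeholder host:

```python
import socket
import time
from typing import Dict, Iterator, Tuple

def flatten(stats: Dict, prefix: str = "webreplay") -> Iterator[Tuple[str, float]]:
    """Turn nested {name: number-or-dict} stats into dotted metric paths."""
    for key, value in stats.items():
        path = f"{prefix}.{key}"
        if isinstance(value, dict):
            yield from flatten(value, path)
        else:
            yield path, float(value)

def to_plaintext(stats: Dict, ts: int) -> str:
    """Render stats in Graphite's line protocol: '<path> <value> <ts>\\n'."""
    return "".join(f"{path} {value} {ts}\n" for path, value in flatten(stats))

def push(stats: Dict, host: str = "graphite.local", port: int = 2003) -> None:
    """Send the rendered lines to Graphite's plaintext TCP listener (port 2003 by convention)."""
    payload = to_plaintext(stats, int(time.time())).encode()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Example with made-up numbers shaped like browsertime summary stats:
sample = {"firstPaint": {"median": 412.0, "stddev": 9.3}}
print(to_plaintext(sample, 1700000000), end="")
```

Swapping Graphite for "similar" (statsd, InfluxDB, …) would only change `push`; the flattening stays the same.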