Using the technique described here in example 5: https://speedcurve.com/blog/user-timing-and-custom-metrics/
While we know it is a flawed hack, it might still have decent correlation to user perception, which we'll be able to verify with the survey.
| Subject | Repo | Branch | Lines +/- |
|---|---|---|---|
| Insert performance mark after first paragraph of article | mediawiki/extensions/NavigationTiming | master | +51 -0 |
| Status | Assigned | Task |
|---|---|---|
| Resolved | Gilles | T165272 Review research on performance perception |
| Declined | Gilles | T184510 Ideas for performance perception studies |
| Resolved | Gilles | T187299 User-perceived page load performance study |
| Invalid | Gilles | T197611 Measure approximate top paragraph timing |
Change 442102 had a related patch set uploaded (by Gilles; owner: Gilles):
[mediawiki/extensions/NavigationTiming@master] Insert performance mark after first paragraph of article
Change 442102 abandoned by Gilles:
Insert performance mark after first paragraph of article
Reason:
Found a clearer explanation here: https://hacks.mozilla.org/2017/09/building-the-dom-faster-speculative-parsing-async-defer-and-preload/
JS execution is blocked until the CSSOM is available, in case the script needs to access styles, and synchronous JS execution in turn blocks the HTML parser. I had assumed browsers were smarter than they are and would inspect the script to determine whether it actually needs the CSSOM, but that doesn't appear to be the case.
Given how DOM-heavy our pages are, this indeed looks like a fool's errand: the measurement itself would significantly alter the page's performance profile.
The hack described by Steve Souders is actually detrimental to performance, which defeats the purpose of measuring anything with it. We'll just have to wait until a proper API is available for measuring time-to-text.