Today we only measure save timing with RUM metrics; it would be helpful to also measure it synthetically.
Description
Details
| Status | Subtype | Assigned | Task |
| --- | --- | --- | --- |
| Open | None | | T255502 Goal: Save Timing median back under 1 second |
| Open | None | | T214460 Add synthetic testing for editing process (save timing) |
| Open | None | | T254805 Document Wptuser account on English Wikipedia |
Event Timeline
Had a go today and I could go through the full flow. Now we just need to decide what to measure :)
We probably want to get the metrics that we send to statsv. I'll ping you @Krinkle in the coming days.
I've started this again, trying to make a saner script that works. I can go through the full scenario on beta, and we need to find a way to get the important metric(s).
@Krinkle is it anything more than `ve.mwTarget.performance.user.saveComplete` we need? What do you think is the best way forward to make it available in JavaScript so we can get at it easily?
Probably easiest to capture via mw.trackSubscribe, although that will be asynchronous.
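The subscription pattern looks roughly like this. A sketch with a stand-in `mw` object, since the real one only exists inside a MediaWiki page, and the topic name is illustrative, not confirmed as what VisualEditor actually emits:

```javascript
// Minimal stand-in for MediaWiki's mw.track / mw.trackSubscribe pub-sub,
// so the subscription pattern can run outside a wiki page.
const handlers = [];
const mw = {
  trackSubscribe: (prefix, fn) => handlers.push({ prefix, fn }),
  track: (topic, data) =>
    handlers
      .filter((h) => topic.indexOf(h.prefix) === 0)
      .forEach((h) => h.fn(topic, data)),
};

// Subscribe to timing events by topic prefix. In real MediaWiki this
// callback fires asynchronously, after the event has been emitted.
let saveComplete = null;
mw.trackSubscribe('timing.ve.saveComplete', (topic, duration) => {
  saveComplete = duration;
});

// Somewhere in the editor code the metric would be emitted like:
mw.track('timing.ve.saveComplete', 1234);
console.log(saveComplete); // 1234
```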
Discussed with @Krinkle at our offsite: let's get the metrics in the backend for our user, and let's add two tests. One: a new page in the user space on enwiki where we just add a new date on every save. Two: on beta, save a large article where we change/insert a date at the beginning of the article.
Change 556166 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Save article on enwiki to pickup save timings in backend.
Change 556166 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Save article on enwiki to pickup save timings in backend.
Change 556971 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Correct config name for save timings configuration.
Change 556971 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Correct config name for save timings configuration.
Change 557020 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Add extra wait time when saving.
Change 557020 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Add extra wait time when saving.
This works locally but not on the machine running the tests.
I tried to record a video in the instance like this:
docker run --rm -v /config:/config -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:11.8.1 --browsertime.videoParams.debug true saveTiming.enwiki.js -n 1 --config /config/secrets.json --multi --s3.removeLocalResult false
But since it fails, the video isn't there. I think there are some things I should fix in Browsertime, but let me try to make the script better first.
Annoying: it works fine on my machine. Switching to Docker I get another error (though not the same one as on AWS). I fixed that, and it now works locally in Docker. I've made a patch in Browsertime so we can take a screenshot when things fail. This is super useful: we can wrap the script in try/catch and, in the catch section, take a screenshot of what's going on, which makes debugging much simpler. Hopefully I can push that on Monday.
Change 557844 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Update to sitespeed.io 11.9.0
Change 557844 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Update to sitespeed.io 11.9.0
Change 558042 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] A little more robust way of edit.
Change 558042 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] A little more robust way of edit.
Change 598915 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] New go at editing an article.
Change 598915 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] New go at editing an article.
Change 598950 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Add extra sleep between edit runs.
Change 598950 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Add extra sleep between edit runs.
Change 598978 had a related patch set uploaded (by Phedenskog; owner: Phedenskog):
[performance/synthetic-monitoring-tests@master] Add save article on beta.
Change 598978 merged by jenkins-bot:
[performance/synthetic-monitoring-tests@master] Add save article on beta.
We got this running on edit-save-timings.webperf.eqiad.wmflabs. It runs two tests, one for enwiki and one for beta. Each logs in as the wptuser, edits the wptuser page by replacing the current text with a new timestamp, and saves the page: https://github.com/wikimedia/performance-synthetic-monitoring-tests/tree/master/tests/edit/desktop/editScripts
@Krinkle do you think this is enough for us to pick up the save timings in the backend, or should we have a more complicated test case?
@Peter I left some review comments at https://gerrit.wikimedia.org/r/#/c/performance/synthetic-monitoring-tests/+/598915/.
I think frontend save timing would suffice for now. But, I'm not entirely sure if we measure this well right now in the current test. Before I speculate though, where can I see the result of the current measures? (My main worry is that VE stashing and the synthetic delays might hide too much of what we want to measure. Especially since the page is too simple/small.)