Once the test in T125946 is merged (expected during the 16:00-17:00 PST SWAT deploy window), do a complete quality assessment of the survey data being collected.
- Assess the effect of the change introduced by T127980: there will be a 30-minute window between 16:00 and 17:00 PST on 2016-02-25 during which we increase the sampling rate to 1:500. Use that data to assess the extent to which no longer showing the survey to DNT users has closed the 30% data loss.
- Compare the distribution of top user agents that interact with the survey against the distribution of user agents in the webrequest logs for the same period. Given that we sample randomly when showing the survey, and setting aside opt-in bias, we expect roughly the same user agents in EL, at least among the top ~15. Are we missing a major user agent in EL?
- Check the QuickSurveysResponses_15266417 table to make sure all the data we are interested in is being collected correctly.
- Check if we have responses in the survey spreadsheet for users who have chosen to participate in the survey.
- Check the QuickSurveyInitiation_15278946 table for impressions, the number of yes and no answers, and the rest of the funnel.
- If there is a critical issue, ping people in T125946 so the survey can be turned off immediately.
- [1]
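The data-loss check above can be sketched as a simple extrapolation: undo the 1:500 sampling on the EL event counts and compare against eligible pageviews from webrequest. This is a minimal sketch; the function name and all counts below are hypothetical placeholders, not real numbers from the tables.

```python
# Sketch: estimate whether the DNT change closed the ~30% data loss.
# All counts are hypothetical; real values would come from the EL tables
# and webrequest logs for the 30-minute 1:500 window.

def estimated_eventlogging_coverage(el_events, sampling_rate, eligible_pageviews):
    """Extrapolate sampled EL event counts back to the full population and
    return the fraction of eligible pageviews that produced an EL event."""
    projected = el_events * sampling_rate  # undo 1:N sampling
    return projected / eligible_pageviews

# Hypothetical counts for the 30-minute window:
el_events = 1_900               # survey impression events recorded in EL
eligible_pageviews = 1_000_000  # webrequest pageviews matching survey criteria

coverage = estimated_eventlogging_coverage(el_events, 500, eligible_pageviews)
print(f"coverage: {coverage:.0%}")  # prints "coverage: 95%"
```

A coverage near 100% on the post-T127980 window, versus ~70% before, would suggest the loss has been closed; a persistent gap means something other than DNT is still dropping events.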
[1] I think we may have a data loss problem. I took the survey about 15 minutes ago, using the URL parameter override to load the widget from the page I was visiting. I completed both screens of the survey, opting into data collection. My record doesn't show up in log.QuickSurveysResponses_15266417 on analytics-store (and there's virtually no replag). Is the URL override skipping logging or was my record meant to be logged? Do the complete responses with a Yes on the 2nd screen match the EL table so far?
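The cross-check asked in the footnote (do complete "Yes" responses match the EL table?) reduces to a set difference on whatever identifier joins the two sources. A minimal sketch, assuming we can extract a comparable session token from both the response spreadsheet and the EL table (the token values below are made up):

```python
def missing_from_el(spreadsheet_tokens, el_tokens):
    """Return tokens present in the response spreadsheet but absent from the
    EventLogging table, i.e. suspected data loss. Order is normalized by
    sorting so the result is deterministic."""
    return sorted(set(spreadsheet_tokens) - set(el_tokens))

# Hypothetical tokens from each source:
sheet = ["tok-a", "tok-b", "tok-c"]  # completed "Yes" responses
el = ["tok-a", "tok-c"]              # rows found in QuickSurveysResponses_15266417

print(missing_from_el(sheet, el))  # prints ['tok-b']
```

Any non-empty result here would corroborate the suspected loss and give concrete records to trace (e.g. whether the URL-parameter override skips logging).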