== Background
We would like to run an A/B test on the page issues feature for two weeks. The test will display the new page issues treatment to one group and the older version to the remainder (as per the bucketing code built in {T193584} and the instrumentation built in {T191532}). We will collect data for two weeks, plus a run-up time of 1-2 days to limit caching and novelty effects. A separate task will be set up for turning the test off. Suggested sampling ratio: 20% (per T200792#4489268).
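For reference, session-based bucketing of this kind can be sketched as follows. This is a minimal illustration, not the actual {T193584} code: the hash function, function names, and bucket labels are all assumptions made for the sketch.

```javascript
// Illustrative sketch only: hash a session token to a number in [0, 1);
// tokens below the sampling rate are enrolled in the test and split evenly
// between control and treatment. Hashing (rather than random assignment)
// means a given session always lands in the same bucket.

function hashToUnitInterval( token ) {
	// 32-bit FNV-1a hash, scaled to [0, 1).
	let hash = 0x811c9dc5;
	for ( let i = 0; i < token.length; i++ ) {
		hash ^= token.charCodeAt( i );
		hash = Math.imul( hash, 0x01000193 );
	}
	return ( hash >>> 0 ) / 0x100000000;
}

function getBucket( sessionToken, samplingRate ) {
	const x = hashToUnitInterval( sessionToken );
	if ( x >= samplingRate ) {
		return 'unsampled';
	}
	// Sampled sessions are split evenly between control and treatment.
	return x < samplingRate / 2 ? 'control' : 'treatment';
}

// At the suggested 20% sampling ratio, roughly 10% of sessions see the new
// treatment and 10% serve as the control group:
// getBucket( someSessionToken, 0.2 );
```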
== Questions we are trying to answer
Does the new treatment for page issues increase the awareness among readers of page issues?
- Is there an increase in clickthrough based on the new issue treatments (from the article page to the issues modal, from the issues modal to anywhere else - details about issues type, modal dismissed, etc, i.e. where do people go after the modal)?
- Does clickthrough depend on the severity of each issue?
- Do mobile edits increase with page issues as referrer?
- What is the approximate percentage of (mobile) pageviews to pages with issues on (select languages)?
== Acceptance criteria
[x] Notify the Analytics Engineering team as described [[https://wikitech.wikimedia.org/wiki/Analytics/Systems/EventLogging/Data_retention_and_auto-purging#Black-listed_schemas |here]] to have both ReadingDepth and PageIssues blacklisted from being stored in MariaDB (see T200792#4489268).
(done, see corresponding AC in T191532)
[-] Enable the A/B test on a single wiki (Latvian Wikipedia, e.g. https://lv.wikipedia.org/wiki/Ropa%C5%BEu_sporta_centrs) at a sampling ratio of 100%: T204609
[-] Run for 2-3 days and check the incoming data (to be confirmed by Olga) - checks made on Latvian Wikipedia
[ ] Roll out page issues to the target projects (Catalan, Japanese, Russian, English), starting at an extremely low sampling rate of 0.01%
[ ] Wait a few days for confirmation from Tilman that the A/B test instrumentation is working as expected on the small sample of incoming data.
[ ] Finish the rollout using several deploy windows in a single day, watching for unexpected spikes in errors or EventLogging volume:
[ ] Progressively roll out page issues to the target projects at 5%
[ ] Progressively roll out page issues to the target projects at 10%
[ ] Set the A/B test for page issues on the selected projects at the A/B test sampling ratio of 20% (per T200792#4489268)
Tentative dates:
- Latvian Wikipedia: Sept 19
- English, Russian, Japanese, Catalan: Oct 1
== Sign off steps
[ ] Wait a few days for confirmation from Tilman that the A/B test instrumentation is working as expected.
[ ] If necessary, create follow-up tasks (for bugs, inconsistencies, etc.)
[ ] Set up a task to analyse the A/B test results
[ ] Prepare a deploy/remove-code task that will be carried out based on the A/B test results.
[ ] Add a note to the project page
[ ] Add a note to the [[https://www.mediawiki.org/wiki/Reading/Web/Release_timeline|release timeline]]
== TODO
1. Use @phuedx or @pmiazga's "bucket breaker" script to find a session ID that'll put you in the correct bucket for testing
- https://gist.github.com/phuedx/d580f01c501d207398828b717bf9870b
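A script of that kind generally works by brute-forcing candidate session tokens until one hashes into the desired bucket. A hedged sketch under assumed bucketing logic follows; the real script is the gist above, and the hash function and bucket labels here are illustrative stand-ins, not its actual code.

```javascript
// Hypothetical "bucket breaker" sketch: generate candidate session tokens
// until one falls into the wanted bucket under an assumed hash-based
// bucketing scheme. Not the actual gist code.

function hashToUnitInterval( token ) {
	// 32-bit FNV-1a hash, scaled to [0, 1).
	let hash = 0x811c9dc5;
	for ( let i = 0; i < token.length; i++ ) {
		hash ^= token.charCodeAt( i );
		hash = Math.imul( hash, 0x01000193 );
	}
	return ( hash >>> 0 ) / 0x100000000;
}

function findTokenForBucket( wanted, samplingRate ) {
	for ( let i = 0; i < 1000000; i++ ) {
		const token = 'session-' + i;
		const x = hashToUnitInterval( token );
		const bucket = x >= samplingRate ?
			'unsampled' :
			( x < samplingRate / 2 ? 'control' : 'treatment' );
		if ( bucket === wanted ) {
			return token;
		}
	}
	return null; // effectively unreachable for any non-degenerate rate
}

// Example: find a token that lands in the treatment group at a 20% ratio,
// then set it as your session ID to force yourself into that bucket.
// findTokenForBucket( 'treatment', 0.2 );
```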