Analyze survey results: look into commonalities and observations among respondents to the Data Quality interviews, interrogate the data, and prepare findings.
Then share the findings with the team for feedback and further action (such as refinement, data quality discussions with product teams, creating a Data-QA checklist phab task, publishing to a Wiki page, etc.).
Description
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | | Mayakp.wiki | T235802 Data Quality Survey Analysis and Presentation: Parent task
Resolved | | Mayakp.wiki | T241271 DQ Survey Presentation to Product Analytics team
Resolved | | Mayakp.wiki | T241273 Data Quality Survey Presentation to Product team
Resolved | | Mayakp.wiki | T247880 Data Quality Presentation: Editing Team
Resolved | | Mayakp.wiki | T247883 Data Quality Presentation: Readers Web Team
Event Timeline
Presented a quick high-level overview of the different responses to PA QA Assessment Round 1 during the team sharing meeting on 10/24. Next: create a formal presentation for defining the QA process, for further discussions with different teams (product, analytics, etc.) about the support we need from them.
Presented the QA Assessment Round 1 findings to the Product Analytics team on 10/30. Next steps: add the implementation slide, incorporate feedback from everyone, and plan presentations with the different product teams and the Analytics team.
Discussed shipping this with Kate and Max Binder:
- QA & analysis guidelines
- Proposed: Maya has meetings with individual teams, but how do we turn this into a repeatable process?
- Stick approach: don't do the work when teams don't follow the workflow
- Carrot approach: deliver more quickly when they do follow the workflow
- Frame the process from a team's perspective: teams are more likely to adopt it when they come up with the solutions themselves
- Keep it light: "we're making this change, and here's why"
- Consider piloting with particular product teams; create an ally
- Do everything that's in your control and nothing beyond that
- Next steps:
  - Keep it small
  - Try to find a partner team to pilot
QA process
- https://phabricator.wikimedia.org/T226564
- Integrate QA with the Instrumentation Checklist
- Specify why new data is needed
- Define research questions, metrics, instrumentation data
- Instrumentation specification ticket and schema definition
- Schema privacy review
- Schema creation with backlink to ticket
- Instrumentation, pipeline scripts (if needed), alerting & monitoring
- QA Testing
- Update data dictionary if needed
- Activate production logging
- Data verification and fixes
- Monitor results
- Long-term Support (LTS)
- Decommissioning
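The steps above can be turned into the proposed Data-QA checklist phab task. A minimal sketch of generating the task description as Remarkup checkboxes (the step list mirrors this page; the function name and structure are illustrative, not an official template):

```python
# Sketch: render the QA process steps above as a Phabricator Remarkup
# checklist, suitable for pasting into a new Data-QA phab task.
QA_STEPS = [
    "Specify why new data is needed",
    "Define research questions, metrics, instrumentation data",
    "Instrumentation specification ticket and schema definition",
    "Schema privacy review",
    "Schema creation with backlink to ticket",
    "Instrumentation, pipeline scripts (if needed), alerting & monitoring",
    "QA testing",
    "Update data dictionary if needed",
    "Activate production logging",
    "Data verification and fixes",
    "Monitor results",
    "Long-term support (LTS)",
    "Decommissioning",
]

def checklist_description(steps):
    """Render each step as an unchecked Remarkup checkbox line."""
    return "\n".join(f"[ ] {step}" for step in steps)

print(checklist_description(QA_STEPS))
```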
Links for phab task template:
https://tools.wmflabs.org/phabulous/
https://phabricator.wikimedia.org/project/view/67/
SDC
https://phabricator.wikimedia.org/project/view/4223/
Check out the Task templates in the left sidebar
https://phabricator.wikimedia.org/maniphest/task/edit/form/43/
https://phabricator.wikimedia.org/transactions/editengine/maniphest.task/view/65/
https://phabricator.wikimedia.org/project/view/5/
https://phabricator.wikimedia.org/maniphest/task/edit/form/59/?title=[Bug]+Example+title&projects=reading-web-backlog&description=%3D%3D%20Steps%20to%20reproduce%0A%23%20E.g.%2C%20login%20and%20visit%20https%3A%2F%2Fen.m.wikipedia.beta.wmflabs.org%2Fwiki%2FBarack_Obama%3Fdebug%3Dtrue%20on%20the%20MinervaNeue%20mobile%20site%20with%20debug%20mode%20enabled.%0A%23%20%0A%23%20%0A%0A%3D%3D%20Expected%20results%0A-%20Description%20of%20what%20should%20have%20happened.%0A%0A%3D%3D%20Actual%20results%0
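The long link above is a prefilled-task URL: the title, projects, and description are passed as URL-encoded query parameters to a Maniphest edit form. A minimal sketch of building such a link programmatically (form ID 59 and the `reading-web-backlog` project come from the link above; the title and description text are illustrative):

```python
# Sketch: build a prefilled Phabricator task URL like the Reading Web
# example above, using URL-encoded query parameters.
from urllib.parse import urlencode

BASE = "https://phabricator.wikimedia.org/maniphest/task/edit/form/59/"

# Illustrative template body in Remarkup; adapt per team.
description = (
    "== Steps to reproduce\n"
    "# \n\n"
    "== Expected results\n"
    "- Description of what should have happened.\n\n"
    "== Actual results\n"
    "- \n"
)

params = {
    "title": "[Bug] Example title",
    "projects": "reading-web-backlog",
    "description": description,
}

# urlencode percent-escapes each value and joins with '&'.
url = BASE + "?" + urlencode(params)
print(url)
```

Opening the printed URL presents the task creation form with those fields already filled in.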
We have presented the Data Quality process to PA team members many times over the years, and it has also been adopted by several product teams.
I do not have any current plans to present to other teams. We need to see how the Data Quality process pans out alongside the work the Metrics Platform team is doing.