
Add a link: leading indicators
Closed, Resolved (Public)

Description

After "add a link" is deployed for two weeks in our pilot wikis, we will calculate the following leading indicators. We will follow the recommended courses of action if the indicators are problematic, and we will use these indicators to decide if and when we should deploy the feature to the other Growth wikis.

Revert rate: A high revert rate suggests that the community finds the Add a Link edits unconstructive. If the revert rate for Add a Link is significantly higher than that of unstructured link tasks, we will analyze the reverts to understand what causes the increase, then adjust the task to reduce the likelihood of edits being reverted.

User rejection rate: A high rejection rate can indicate that we are suggesting many links that are not good matches. If the rejection rate is above 30%, we will QA the link recommendation algorithm and adjust thresholds or make other changes to improve the quality of the recommendations.

Over-acceptance rate: A high over-acceptance rate might indicate that users aren't actually applying judgment to their tasks, meaning we might want to implement quality gates. (What percentage of users who have a complete session have never rejected or skipped a link? What percentage of users who have five or more complete sessions have never rejected or skipped a link? How many sessions across all users contained only acceptances?)

Task completion rate: A low completion rate might indicate that there's an issue with the editing workflow. If the proportion of users who start the Add a Link task and complete it is below 75%, we will investigate where in the workflow users leave and deploy design changes to enable them to continue.
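These indicators are simple aggregations over instrumentation events. As a rough illustration, the sketch below computes all four from a hypothetical pandas frame `events` with one row per suggested link shown to a user (columns `user_id`, `session_id`, `action`, `edit_saved`, `reverted`, `session_completed`); the schema and column names are assumptions for the example, not the actual Growth instrumentation.

```python
import pandas as pd

def leading_indicators(events: pd.DataFrame) -> dict:
    """Compute the four leading indicators from a per-suggestion event log."""
    # Revert rate: share of saved Add a Link edits that were later reverted.
    saved = events[events["edit_saved"]]
    revert_rate = saved["reverted"].mean()

    # User rejection rate: share of shown suggestions the user rejected.
    rejection_rate = (events["action"] == "reject").mean()

    # Over-acceptance rate: among users with at least one complete session,
    # the share who never rejected or skipped a single suggestion.
    completers = events.loc[events["session_completed"], "user_id"].unique()
    over_acceptance_rate = (
        events[events["user_id"].isin(completers)]
        .groupby("user_id")["action"]
        .agg(lambda a: (a == "accept").all())
        .mean()
    )

    # Task completion rate: completed sessions out of sessions started.
    task_completion_rate = (
        events.groupby("session_id")["session_completed"].any().mean()
    )

    return {
        "revert_rate": revert_rate,                    # compare to unstructured tasks
        "rejection_rate": rejection_rate,              # plan triggers above 30%
        "over_acceptance_rate": over_acceptance_rate,
        "task_completion_rate": task_completion_rate,  # plan triggers below 75%
    }
```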

Event Timeline

Adding Product Analytics to the project tags so we can triage this in board refinement tomorrow.

Chatting with @MMiller_WMF, we'd like to add a fourth metric to capture what proportion of users accept all suggested links. We want to compare that with the rejection rate already listed, and to calculate the rejection rate separately with users who accept all links withheld.
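For concreteness, here is one way the new metric and the adjusted rejection rate could be computed, reusing the hypothetical `events` frame from the sketch in the description (the schema is still an assumption for illustration):

```python
import pandas as pd

def acceptance_metrics(events: pd.DataFrame) -> dict:
    # Per-user flag: did this user accept every suggestion they saw?
    per_user_all_accept = events.groupby("user_id")["action"].agg(
        lambda a: (a == "accept").all()
    )
    # Broadcast the per-user flag back to individual event rows.
    accepts_all_rows = events["user_id"].map(per_user_all_accept).astype(bool)

    return {
        # Fourth metric: proportion of users who accept all suggested links.
        "all_accept_user_share": per_user_all_accept.mean(),
        # Rejection rate over all users, as already listed.
        "rejection_rate": (events["action"] == "reject").mean(),
        # Rejection rate with the all-accepting users withheld.
        "rejection_rate_excl_all_accept": (
            events.loc[~accepts_all_rows, "action"] == "reject"
        ).mean(),
    }
```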

I've drafted a slide deck and handed it off to Marshall for review. Once we've resolved any outstanding issues, I'll write up a summary and resolve this task.

Thanks, @nettrom_WMF! Today, I added comments and questions in the slide deck for you.

The analysis has been done and shared with the team. I uploaded the notebook for the analysis to GitHub. The next step is sharing these results more widely, which I'm leaving in the capable hands of @MMiller_WMF, but I can help where needed. The Growth team had many follow-up questions, and we need to create a new task for those questions so they can be prioritized and answered.

The analysis has been updated to categorize users by whether they registered before or after the deployment of Add a Link. This makes comparisons between Add a Link and the unstructured link task easier, and makes it clearer that existing users were almost exclusively the ones getting Add a Link. Without that split, prolific experienced users would have made Add a Link look overwhelmingly good, which could have created a lot of confusion.
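A minimal sketch of that cohort split, assuming a hypothetical `users` frame with a `registration` timestamp; the deployment date below is a placeholder, and in practice it would differ per pilot wiki:

```python
import pandas as pd

DEPLOYMENT = pd.Timestamp("2021-05-27")  # placeholder, not the real per-wiki date

def add_registration_cohort(users: pd.DataFrame) -> pd.DataFrame:
    """Label users by whether they registered before or after Add a Link deployed."""
    users = users.copy()
    users["cohort"] = users["registration"].ge(DEPLOYMENT).map(
        {True: "post-deployment", False: "pre-deployment"}
    )
    return users
```

Grouping the indicator computations by this `cohort` column keeps prolific experienced users from dominating the comparison.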

The slide deck has been updated with the new results, and they have also been posted to https://www.mediawiki.org/wiki/Growth/Personalized_first_day/Structured_tasks/Add_a_link#Leading_Indicators_and_Plan_of_Action