After "Add a Link" is deployed for two weeks in our pilot wikis, we will calculate the following leading indicators. If an indicator is problematic, we will follow the corresponding plan of action, and we will use these indicators to decide if and when to deploy the feature to the other Growth wikis.
| Indicator | Plan of Action |
| --- | --- |
| Revert rate | A high revert rate suggests that the community finds Add a Link edits unconstructive. If the revert rate for Add a Link is significantly higher than that of unstructured link tasks, we will analyze the reverts to understand what causes the increase, then adjust the task to reduce the likelihood of edits being reverted. |
| User rejection rate | A high rejection rate can indicate that many of the suggested links are poor matches. If the rejection rate is above 30%, we will QA the link recommendation algorithm and adjust thresholds or make other changes to improve the quality of the recommendations. |
| Over-acceptance rate | A high over-acceptance rate might indicate that users aren't actually applying judgment to their tasks, meaning we might want to implement quality gates. (What percentage of users who have a complete session have never rejected or skipped a link? What percentage of users who have five or more complete sessions have never rejected or skipped a link? How many sessions across all users contained only acceptances?) |
| Task completion rate | A low completion rate might indicate that there's an issue with the editing workflow. If the proportion of users who start the Add a Link task and complete it is lower than 75%, we will investigate where in the workflow users leave and deploy design changes to enable them to continue. |
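As a rough illustration of how the indicators above could be computed, here is a minimal Python sketch over hypothetical session records. The field names (`accepted`, `rejected`, `skipped`, `reverted`, `completed`) and the toy data are assumptions for illustration, not the actual Add a Link event schema.

```python
# Hypothetical per-session records; schema and values are illustrative only.
sessions = [
    {"user": "A", "accepted": 3, "rejected": 1, "skipped": 0, "reverted": 0, "completed": True},
    {"user": "A", "accepted": 2, "rejected": 0, "skipped": 0, "reverted": 1, "completed": True},
    {"user": "B", "accepted": 4, "rejected": 0, "skipped": 0, "reverted": 0, "completed": True},
    {"user": "C", "accepted": 1, "rejected": 0, "skipped": 0, "reverted": 0, "completed": False},
]

def rejection_rate(sessions):
    """Share of link suggestions that users rejected, out of all decisions made."""
    decisions = sum(s["accepted"] + s["rejected"] + s["skipped"] for s in sessions)
    rejected = sum(s["rejected"] for s in sessions)
    return rejected / decisions if decisions else 0.0

def completion_rate(sessions):
    """Share of started sessions that were completed (compare against the 75% threshold)."""
    return sum(1 for s in sessions if s["completed"]) / len(sessions)

def over_acceptance_share(sessions):
    """Share of users whose completed sessions never contain a reject or skip."""
    by_user = {}
    for s in sessions:
        if s["completed"]:
            by_user.setdefault(s["user"], []).append(s)
    only_accepts = [
        u for u, ss in by_user.items()
        if all(s["rejected"] == 0 and s["skipped"] == 0 for s in ss)
    ]
    return len(only_accepts) / len(by_user) if by_user else 0.0
```

The revert rate would be computed the same way from the `reverted` field, ideally split by task type so Add a Link can be compared against unstructured link tasks as described above.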