To measure usage of the New Filters beta tool, we will pull baseline figures from before the beta release and compare them to recent stats among beta users.
What we want to know
- Tool usage profile: Are people using more tools per session or per capita than before? Which filters and other tools (e.g., highlighting) do reviewers use most often? (We could simply rank the tools selected, most to least popular.) How does this compare to pre-beta?
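A minimal sketch of the "rank the tools selected" idea, assuming we can extract a flat log of tool-selection events; the tool names below are placeholders, not the real schema:

```python
from collections import Counter

# Hypothetical flat log of tool selections pulled from event data.
selections = ["highlighting", "bots", "highlighting", "newcomers", "bots", "highlighting"]

# Rank tools by how often they were selected, most popular first.
ranking = Counter(selections).most_common()
# ranking is a list of (tool, count) pairs, e.g. [("highlighting", 3), ...]
```

Running the same ranking on pre-beta and beta-period logs would give the before/after comparison directly.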
- Highlighting usage: What are the top qualities that people choose to highlight? What proportion of beta users (sessions?) employ highlighting?
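For the "what proportion of beta users employ highlighting" question, a sketch under the assumption that each event record carries a user and an action field (both field names are placeholders):

```python
# Hypothetical event log; "user" and "action" fields are assumptions.
events = [
    {"user": "A", "action": "highlight"},
    {"user": "A", "action": "filter"},
    {"user": "B", "action": "filter"},
]

# Share of users with at least one highlighting event.
users = {e["user"] for e in events}
highlighters = {e["user"] for e in events if e["action"] == "highlight"}
share = len(highlighters) / len(users)
```

The same two-set division works at the session level if we group events by session instead of by user.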
- Page popularity (sessions): Can we establish a valid metric, and then a baseline stat, for something we might call "page sessions"? (E.g., a period of RC page and other page use that terminates when the user does not return to the RC page for 30 minutes.) [ROAN HAS AN IDEA FOR HOW TO DO THIS CRUDELY]
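The crude version of that 30-minute cutoff is a standard inactivity-timeout sessionization. A minimal sketch, assuming all we have per user is a list of RC-page visit timestamps:

```python
from datetime import timedelta

# Assumed cutoff from the definition above: a session ends when the user
# does not return to the RC page within 30 minutes.
SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(timestamps):
    """Split a user's visit datetimes into sessions (lists of datetimes).

    A gap longer than SESSION_TIMEOUT between consecutive visits starts
    a new session.
    """
    sessions = []
    current = []
    for ts in sorted(timestamps):
        if current and ts - current[-1] > SESSION_TIMEOUT:
            sessions.append(current)
            current = [ts]
        else:
            current.append(ts)
    if current:
        sessions.append(current)
    return sessions
```

With sessions defined this way, `len(sessions)` per period gives the "page sessions" baseline stat.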
- Tool engagement: What proportion of "page sessions" use only default settings vs. those that involve tool selections? (If this goes up with the new system, we can conclude the interface has made the tools more accessible.) [ROAN HAS AN IDEA FOR HOW TO DO THIS CRUDELY]
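A crude sketch of that proportion, assuming each session can be reduced to the list of tool selections made during it (an empty list meaning defaults only):

```python
def tool_engagement_rate(sessions):
    """Fraction of sessions that involve at least one tool selection.

    Each session is represented as a list of tool selections; an empty
    list means the session used only default settings.
    """
    if not sessions:
        return 0.0
    engaged = sum(1 for s in sessions if s)
    return engaged / len(sessions)
```

Comparing this rate pre-beta vs. during the beta is the accessibility signal the bullet describes.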
- Session length: Another traditional measure of engagement is length of session, the theory being that if users like the tool they will use it longer.
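Once sessions are defined, session length is straightforward. A sketch assuming each session is reduced to a (start, end) datetime pair; the median is used because session lengths tend to be skewed:

```python
from statistics import median

def median_session_minutes(sessions):
    """Median session length in minutes.

    Each session is an assumed (start, end) pair of datetimes.
    """
    durations = [(end - start).total_seconds() / 60 for start, end in sessions]
    return median(durations)
```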
- Distinct users: How many distinct users use the RC Page?
- What wikis should we study: en.wiki, cs.wiki, he.wiki, de.wiki, fr.wiki
- What period: 2 weeks, to avoid what look like day-of-week fluctuations.
- When: before the beta release and as recently as practical.
- Who: On the theory that beta users are a more advanced group, it would be ideal if we could study before and after results for the same group of people.