
Analytics for Reward Interstitials
Closed, Resolved, Public

Description

Reward interstitials are connected to the user's profile stats and are shown across all Suggested Edits tasks (article descriptions, image captions, image tags).

We want to run an A/B test to learn whether motivational elements increase session length and edits per session.


04/10/2020:
Shay is outlining an A/B/C testing schema and will be working with @Dbrant, who will implement a mechanism to compare user groups by exposure to features.
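A mechanism like the one described above is commonly implemented as deterministic hash-based bucketing, so a user lands in the same group on every session. This is a hypothetical sketch, not the actual app implementation; the group-name format follows the `suggestedEditsInterstitial_GroupA` / `GroupB` convention used later in this task, and the hashing scheme itself is an assumption.

```python
import hashlib

# Hypothetical experiment groups; the real test used GroupA and GroupB.
GROUPS = ["GroupA", "GroupB", "GroupC"]

def assign_group(user_id: str, experiment: str = "suggestedEditsInterstitial") -> str:
    """Deterministically bucket a user into an experiment group.

    Hashing (experiment, user_id) keeps the assignment stable across
    sessions and independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(GROUPS)
    return f"{experiment}_{GROUPS[bucket]}"
```

Because the bucket is derived from the user ID rather than stored state, no server-side lookup table is needed to keep group membership consistent.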


Reward Events:
Contributions:
Shown at the 5th contribution, then after every 50 additional contributions (5, 55, 105, and so forth)
Edit streak:
Shown on every 5th day of an edit streak (5, 10, 15, and so forth)
The edit streak count table we currently have, wikimedia_editor_tasks_edit_streak, is a very rudimentary count of users and days of streaks. We will need a better way to track these users, or are we already using another mechanism to check for edit streaks?
[[ URL | Edit quality: ]]
Shown every 14 days when the revert rate is "Perfect", "Excellent", "Very good", or "Good", and the user has actively contributed in the past 14 days (at least one edit)
[[ URL | Page views: ]]
Shown once a month when the user has actively contributed in the past 30 days (at least one edit)
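The contribution and streak schedules above are simple modular rules. As a sketch (function names are hypothetical, not the app's actual code), the trigger conditions can be expressed as:

```python
def contribution_milestone(count: int) -> bool:
    """Interstitial at the 5th contribution, then every 50th after
    that: 5, 55, 105, and so forth."""
    return count >= 5 and (count - 5) % 50 == 0

def streak_milestone(streak_days: int) -> bool:
    """Interstitial on every 5th consecutive day of an edit streak:
    5, 10, 15, and so forth."""
    return streak_days > 0 and streak_days % 5 == 0
```

The edit-quality and page-views events are time-based (every 14 or 30 days with at least one edit in the window) and would be checked against timestamps rather than counters.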

Related Objects

| Status | Assigned |
| Resolved | scblr |
| Resolved | cooltey |
| Resolved | SNowick_WMF |

Event Timeline

LGoto triaged this task as Medium priority. Apr 13 2020, 4:41 PM
LGoto edited projects, added Product-Analytics (Kanban); removed Product-Analytics.
LGoto moved this task from Next 2 weeks to Doing on the Product-Analytics (Kanban) board.

@Charlotte this isn't included in the 'Needs Analytics' column; it might need to be moved there. As of today we have 701 users in the test pool: 430 in GroupA and 271 in GroupB. I also want to verify with @Dbrant which group receives the rewards.

Charlotte lowered the priority of this task from Medium to Low. Jul 15 2020, 4:16 PM
Charlotte lowered the priority of this task from Low to Lowest. Aug 10 2020, 2:02 PM
Charlotte raised the priority of this task from Lowest to Low. Aug 27 2020, 6:47 PM
SNowick_WMF renamed this task from Measure Reward Interstitials to Analytics for Reward Interstitials. Oct 26 2020, 10:16 PM

@Dbrant Checking in to see how long this feature has been active. As of 2020-10-26 we have a fairly small sample of users:

| users | test_group |
| 703 | suggestedEditsInterstitial_GroupA |
| 687 | suggestedEditsInterstitial_GroupB |

Since this feature reaches a highly active set of editors, these counts may be as expected; I just want to be sure before I start analysis.

SNowick_WMF raised the priority of this task from Low to Medium. Dec 17 2020, 10:19 PM
SNowick_WMF moved this task from Backlog to Kanban on the Product-Analytics board.

Results from the 89-day data set show very little difference between Group A, which was presented with reward interstitials, and Group B, which was not.

Session frequency and length:

| test_group | average minutes per session | sessions per user |
| suggestedEditsInterstitial_GroupA | 20.18 | 37.8 |
| suggestedEditsInterstitial_GroupB | 20.01 | 38.2 |

Editing frequency:

| test_group | average edits per user | median edits per user |
| suggestedEditsInterstitial_GroupA | 63.01 | 14 |
| suggestedEditsInterstitial_GroupB | 62.57 | 14 |

Data spreadsheet

Thanks @SNowick_WMF ...

> Results from the 89-day data set show very little difference between Group A, which was presented with reward interstitials, and Group B, which was not.

Average minutes per session and average edits per user are both slightly higher for Group A, so I assume that adding the interstitials to the feed doesn't hurt. What is the p-value for these results? If the result is significant (p ≤ 0.05), we'll take this minor win.
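Answering the p-value question requires per-user values, not just the group means reported above. As a sketch, a two-sided Welch test could compare the groups; with roughly 700 users per group, a normal approximation to the t distribution is reasonable. The function and the sample data below are hypothetical; a real analysis would pull per-user values from the data spreadsheet.

```python
import math
from statistics import mean, variance

def welch_p_value(a, b):
    """Two-sided Welch test p-value for a difference in means,
    using a normal approximation (fine for large samples)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    z = (mean(a) - mean(b)) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical per-user minutes-per-session samples for illustration only.
group_a = [20.5, 19.8, 21.0, 20.2, 19.5]
group_b = [20.0, 19.9, 20.3, 19.7, 20.1]
p = welch_p_value(group_a, group_b)
```

Given how small the observed differences are (20.18 vs. 20.01 minutes, 63.01 vs. 62.57 edits), the p-values may well come out above 0.05, which would mean the result is consistent with no effect.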

Data spreadsheet

Requested access for the doc.