
Thank You page experiment: First week analysis of English Wikipedia experiment
Closed, Resolved · Public

Description

User story:

As an English Wikimedian, I want to understand the impact of the Thank You page experiment on English Wikipedia, so that I can decide if we should continue this experiment.

Documentation:

https://www.mediawiki.org/wiki/Extension:GrowthExperiments/Technical_documentation/Campaigns/Creation_of_customized_landing_pages

Background:

Previous similar task: T331495: Quick Analysis: Thank You Pages: custom account creation pages for sv, it, ja, fr, nl
Previous research: Newcomer Experience Pilot Project – Thank You Pages and Thank You Banners

Acceptance Criteria:

Review accounts created with the campaign parameter typage-6C-en-2023.

Provide the following metrics:

  • Page views
  • Unique visitors
  • Registrations
  • Registration %
  • Activations
  • Activation %

Ideally we can look at Page views and registrations ~24 hours after release, and then look at activation and other metrics later in the week.

Event Timeline

KStoller-WMF created this task.
KStoller-WMF moved this task from Inbox to Backlog on the Growth-Team board.
KStoller-WMF added a subscriber: spatton.

I've run the numbers on the currently available data^1 to get a sense of where we're at. Overall, we get the following counts and proportions of page views, "unique visitors" (based on combinations of IPs and user-agents for page views of the account creation page), registrations, and activations.

| Page views | Unique visitors | Registrations | Registration rate | Activations | Activation rate |
| --- | --- | --- | --- | --- | --- |
| 1,215 | 1,075 | 409 | 38.0% | 54 | 13.2% |
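To make the denominators explicit: registration rate here is registrations per unique visitor, and activation rate is activations per registration. A quick sketch reproducing the rates in the table above (plain Python, no assumptions beyond the counts shown):

```
# Funnel counts from the table above (first ~24 hours of data).
page_views = 1_215
unique_visitors = 1_075
registrations = 409
activations = 54

# Each rate uses the previous funnel step as its denominator.
registration_rate = registrations / unique_visitors  # 409 / 1,075
activation_rate = activations / registrations        # 54 / 409

print(f"Registration rate: {registration_rate:.1%}")  # -> 38.0%
print(f"Activation rate:   {activation_rate:.1%}")    # -> 13.2%
```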

We do not have a large enough dataset to publish a more fine-grained analysis (ref our data publication guidelines). What I can say is that we're looking at a roughly 60/40 split, with mobile web being the more-used platform, and similar registration and activation rates on both platforms.

Edit I: Adding Footnote 1: "currently available data" means that I haven't done any truncation of the data to ensure all accounts have equal potential for completing any step of the funnel, as that would require us to wait four days to get the first day's worth of data. There's a 24-hour window for activation to happen, ~~and an additional 48-hour window for any activation edit to be reverted~~. In this case, we prioritized getting early indicators over formality. When we grab data for a final analysis, we'll ensure that enough time has passed.
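In other words, a properly truncated dataset would only include accounts registered early enough for both windows to have fully elapsed. A minimal sketch of that cutoff logic, assuming a hypothetical `registrations` list of `(user_id, registration_ts)` pairs:

```
from datetime import datetime, timedelta, timezone

ACTIVATION_WINDOW = timedelta(hours=24)  # time allowed for a first edit
REVERT_WINDOW = timedelta(hours=48)      # time allowed for that edit to be reverted

# Only accounts registered before this cutoff have had the full
# funnel window (24h + 48h) available to them.
cutoff = datetime.now(timezone.utc) - (ACTIVATION_WINDOW + REVERT_WINDOW)

# `registrations` is an illustrative input, not an actual dataset name.
eligible = [(user, ts) for user, ts in registrations if ts <= cutoff]
```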

Edit II: Due to a data issue, we didn't correctly identify reverts in this data. Hence I've struck out the note about the 48-hour window for reverts, and note that "activations" are based on "all edits across all namespaces, reverted edits included".

Great, thanks for the quick update!

The registration rate is slightly lower than we saw with previous experiments in other languages, but I don't think it's low enough to be concerned.

And that activation rate looks comparable to what we've seen previously. It's lower than the average activation rate for "organically" created new accounts, but I think that's expected for accounts created in this way: it's far more likely that the new account holder is just curious and/or might edit at a later time (outside the 24-hour "activation" window).

I've collected data to understand the effects of this campaign after about one week. During the data gathering for this update, I discovered a data quality issue in one of the underlying datasets that prevented us from correctly identifying reverts (filed as T352899). I was able to work around this by modifying the query that collects editing data.

Using pageview and registration data for the first week, from deployment on 2023-10-28 until the end of 2023-11-04, with editing data up to 2023-11-06 13:00 UTC (see the note below about how this affects reverts), we have enough data to show platform-specific counts as well as the overall total. This gives us the overview table shown below:

| Platform | Page views | Unique visitors | Registrations | Registration rate | Activations | Activation rate |
| --- | --- | --- | --- | --- | --- | --- |
| Desktop | 1,773 | 1,548 | 557 | 36.0% | 68 | 12.2% |
| Mobile web | 2,509 | 2,277 | 864 | 37.9% | 79 | 9.1% |
| Totals | 4,282 | 3,825 | 1,421 | 37.2% | 147 | 10.3% |

In this case, "activation" means making an edit within 24 hours of registration that is not subsequently reverted within 48 hours. Because our dataset does not contain the complete day of 2023-11-06 (eleven hours are missing), not all edits have had the full 48-hour revert window.
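A minimal sketch of that activation definition, assuming pandas DataFrames with illustrative column names (not the actual schema of the underlying datasets):

```
import pandas as pd

# Illustrative inputs:
#   users: one row per registration (user_id, registration_ts)
#   edits: one row per edit (user_id, edit_ts, reverted), where `reverted`
#          is True if the edit was undone within 48 hours of being made.

def activated_user_ids(users: pd.DataFrame, edits: pd.DataFrame) -> set:
    """User IDs with a non-reverted edit within 24 hours of registration."""
    merged = edits.merge(users, on="user_id")
    within_24h = (merged["edit_ts"] - merged["registration_ts"]) <= pd.Timedelta(hours=24)
    qualifying = merged[within_24h & ~merged["reverted"]]
    return set(qualifying["user_id"])

# activation_rate = len(activated_user_ids(users, edits)) / len(users)
```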

Taking that caveat about reverts into account, we also estimated the revert rate of the edits these users made during their first 24 hours on the wiki. Looking across all edits, with the knowledge that contribution amounts vary greatly between users, the revert rate is 7.1% out of <300 edits. The rate is lower on desktop (5.1% out of <150 edits) than on mobile web (8.9% out of <150 edits).

Lastly, we calculated what proportion of these edits were Suggested Edits, and overall it's 61.5% out of <300 edits. This rate is again lower on desktop (52.6% out of <150 edits) than on mobile web (69.9% out of <150 edits).
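Both of these edit-level rates are plain proportions over the same set of first-24-hour edits. Continuing the sketch above, with hypothetical boolean columns `reverted` and `is_suggested_edit` on the `edits` frame:

```
# Share of first-24-hour edits that were later reverted.
revert_rate = edits["reverted"].mean()

# Share of those same edits made through the Suggested Edits workflow.
suggested_edit_share = edits["is_suggested_edit"].mean()

# Per-platform breakdown, assuming an illustrative "platform" column
# with values like "desktop" and "mobile web".
by_platform = edits.groupby("platform")[["reverted", "is_suggested_edit"]].mean()
print(by_platform)
```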

I believe we can consider this task resolved, as we have provided first-day and first-week metrics, and we have a separate task to follow up in January after the end of the banner campaign: T352900: Thank You page experiment: Final analysis of English Wikipedia experiment

KStoller-WMF renamed this task from "Thank You page experiment: Analysis of English Wikipedia experiment" to "Thank You page experiment: First week analysis of English Wikipedia experiment". Jan 5 2024, 5:03 PM