Add a link: onboarding experiment
Open, HighPublic

Description

In our analysis of the funnel for "add a link", we saw that users who completed the onboarding dialogs were substantially more likely to complete the rest of the workflow and wind up making a link edit. This effect may be due to either of two phenomena or a combination of them:

  • The onboarding content helps users understand how the task works, leading them to success.
  • Users who are attentive and conscientious enough to complete onboarding are the same kind of users that are attentive and conscientious enough to complete the task.

We propose an experiment in which a portion of "add a link" users are not given onboarding at all, and the other portion are not given the option to skip onboarding. From this, we would want to discover:

  • Which group has better completion rates at each stage of the funnel ("link decision", "task review", "task completion")?
  • Which group has higher edit volume across all sessions?
  • Which group has a higher revert rate in their first session? Across all sessions?
  • Which group is more likely to do a second task after completing their first?
  • Which group has a higher rate of opening the help content during the task?
  • For the group that cannot skip onboarding, how does the bounce rate from onboarding compare to the previous cohort that could skip?

For the group that does not have the option to skip onboarding, we would also want to make this design change: the "don't show again" checkbox would be on the third onboarding screen, not the first. That change should be filed as a subtask.

We'll also need to consider how the escape key behaves for each variant, and whether it allows the onboarding dialog to be dismissed.
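For rough intuition on the scale such an experiment needs, here is a back-of-the-envelope sample-size sketch for comparing completion rates between two groups, using a standard two-proportion z-test approximation. The baseline rate and effect size below are placeholders for illustration, not measured values from our funnel analysis:

```python
import math
from statistics import NormalDist

def sample_size_per_group(p1: float, p2: float,
                          alpha: float = 0.05, power: float = 0.80) -> int:
    """Normal-approximation sample size per group for a two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_b = NormalDist().inv_cdf(power)          # desired statistical power
    p_bar = (p1 + p2) / 2                      # pooled proportion under H0
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Placeholder scenario: detect a lift in completion rate from 30% to 35%.
print(sample_size_per_group(0.30, 0.35))  # roughly 1,400 users per group
```

The actual numbers would come from the baseline rates in our existing "add a link" data, which is why the experiment design should start from those measurements.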

Event Timeline

@nettrom_WMF -- could you please design this experiment and add your recommended details to the task description? I think you are best positioned to propose the sizes of the group assignments, to estimate how long it should run for, and to know whether we're collecting the data we'll need to answer the questions. You're also welcome to modify or add to the question list.

@RHo -- you mentioned a third alternative that is lightweight onboarding, perhaps a dialog saying where you can learn more. Could you please mock that up and add it so we can decide if it should be part of the experiment?

Also -- from our team discussion: we are concerned that having a group with no onboarding will cause bad edits and community concerns, but on the other hand it will give us learnings that last a long time. We still have to decide.

I have moved this task from the "add a link iteration 2" epic to the "improvements" epic. That's because this onboarding experiment will not be a blocker for deploying to more wikis. But it is still a near-term priority and belongs on our sprint board.

Hi @MMiller_WMF, here are mocks for the 'lightweight' onboarding idea that I think we should test, along with the "Mandatory onboarding" and "No onboarding" variants:

[Attachment: image.png, 561 KB -- mocks of the three onboarding variants]

As discussed in chat, the proposal would be:

  • Status quo: the existing skippable onboarding, unchanged.
  • No onboarding: users go straight to the task with zero introduction.
  • Mandatory onboarding: users must go through the existing onboarding once, via three key changes: (i) removing the "Skip all" action from steps 1 and 2; (ii) moving the "Don't show this again" checkbox to the last step; and (iii) making that checkbox selected by default.
  • Minimal onboarding: users see a single, simple (non-full-screen) dialog with "Get started" as the primary action, while "Learn more" opens the status quo onboarding.

Notes from a team discussion:

If we stop the add image experiment, we could do something like define multiple new variants:

  • addlink_onboarding_control
  • addlink_onboarding_none
  • addlink_onboarding_full
  • addlink_onboarding_simple

Then we could configure wgGEHomepageNewAccountVariantsByPlatform to assign percentages to each of these variants on desktop and mobile.
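As a hypothetical sketch only (the exact schema is defined by the GrowthExperiments extension and may differ), the per-platform assignment could look something like an even split across the four variants:

```json
{
    "GEHomepageNewAccountVariantsByPlatform": {
        "addlink_onboarding_control": { "desktop": 25, "mobile": 25 },
        "addlink_onboarding_none":    { "desktop": 25, "mobile": 25 },
        "addlink_onboarding_full":    { "desktop": 25, "mobile": 25 },
        "addlink_onboarding_simple":  { "desktop": 25, "mobile": 25 }
    }
}
```

The actual percentages would come from the experiment design (e.g. from the sample-size estimates), and the split would not need to be even.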

@nettrom_WMF @KStoller-WMF so I think the least time-consuming plan would be:

  • stop the add image experiment
  • define multiple new variants for add link as defined above

Otherwise, we would need to do T288022: Support multidimensional user variants in GrowthExperiments, which is probably a medium-sized task. How would you like to move forward on this?

I'm in support of stopping the Add an Image experiment while we run this experiment with Add a Link. It'll enable us to learn faster since we can assign these groups across all users who get the Growth features. We don't have to turn Add an Image off completely, but I'd propose not having that structured task enabled by default.

Given the similarities in onboarding for both add link and add image, should this experiment encompass both features?

+1, would be in favour of this!

Moving to Triaged for now: I've discussed this ticket with a few team members, and I'm not convinced the effort is worth the work at this point in time. Once we are ready to focus more on structured task improvements, this or a similar experiment might be worth pursuing.