Goal
Increase the percentage of newcomers interacting with and choosing articles from the suggested edits module, and ultimately increase the percentage of newcomers saving edits via the workflow. This distinction is important because it is theoretically possible to come up with designs that pull more people into the workflow but don't give them enough context to complete it.
Background
When newcomer tasks was first introduced to the newcomer homepage, we conducted our first variant test: T238888: Variant tests: "initiation" test (A vs. B)
- Variant A newcomers had to "initiate" the suggested edits module on their homepage by selecting a call to action in the Start module and proceeding through two onboarding overlays that introduced them to the content.
- Variant B newcomers saw the suggested edits module right from their arrival ('pre-initiated') and didn't see the onboarding overlays.
Hypothesis
The first variant test taught us that a more prominent module can attract more interaction. But we also think it showed that onboarding (introducing the concept of "suggested edits") is an important part of converting newcomers who interact with suggested edits into newcomers who save a suggested edit. By combining the best of the two previous variants, taking users through onboarding (Variant A) while showing prominent suggested edits content by default (Variant B), we expect more newcomers to use suggested edits.
Takeaways from Var A vs B that inform this Var C vs D test:
1. More prominent SE module increases interaction.
This finding is applied as a best practice in variants C and D. Both focus on driving editing as the core goal of the newcomer experience. As part of this, the Start module will be removed so that onboarding and orientation center on Suggested edits instead.
2. Onboarding is also important.
Although variant B doubled interactions, navigations went up by only 60% and clicks by only 30%. We infer that while people who see the pre-initiated feature do try it out, these ‘uninformed’ interactions lead to more drop-offs: in per-interaction terms, conversion actually fell (see the sketch after this list). This is where the onboarding info in variant A seems to play an important role in user engagement. This interpretation is supported by indicative results (i.e., not tested for statistical significance) further down the funnel: edit attempts and completed edits are higher in variant A.
3. Mobile SE preview design should be more actionable.
Mobile variant B performed slightly worse than variant A, suggesting the SE preview card design isn’t encouraging discovery.
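To make the inference in takeaway 2 concrete, here is a minimal back-of-the-envelope sketch. The multipliers are the relative changes reported above for variant B vs. variant A; the absolute baseline counts are hypothetical, chosen only for illustration.

```python
# Illustrative arithmetic only: why a 2x rise in interactions paired with
# a 1.3x rise in clicks implies a lower per-interaction conversion rate.
baseline = {"interactions": 1000, "navigations": 400, "clicks": 200}  # hypothetical counts
multipliers = {"interactions": 2.0, "navigations": 1.6, "clicks": 1.3}  # reported relative changes

variant_b = {stage: count * multipliers[stage] for stage, count in baseline.items()}

for stage in ("navigations", "clicks"):
    rate_a = baseline[stage] / baseline["interactions"]
    rate_b = variant_b[stage] / variant_b["interactions"]
    print(f"{stage} per interaction: A={rate_a:.2f}, B={rate_b:.2f} "
          f"({(rate_b / rate_a - 1):+.0%})")

# Output:
# navigations per interaction: A=0.40, B=0.32 (-20%)
# clicks per interaction: A=0.20, B=0.13 (-35%)
```

Whatever the true baseline, the ratios alone show a 20% drop in navigation rate and a 35% drop in click rate per interaction, which is the drop-off pattern attributed to 'uninformed' interactions.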
Proposal
Test two new variants, Variant C and Variant D, to see whether emphasis on 'onboarding' content or prominence of the SE call to action leads to higher interaction and more editing.
Mocks: https://wikimedia.invisionapp.com/overview/Variants-C-G-SE-top-of-funnel-tests-ck77r3arw00ea012a8bk1bh0v/screens?v=jXSLq0KIm0I3Xnr7YR5nPw%3D%3D
Redlined versions will be shared in the Growth Zeplin board (tagged with Variant-Pre-initiatedSE)
Note: the variants for mobile and desktop are not strictly comparable. They are similar in spirit, but can essentially be thought of as four different variants. This Phab task will refer to them as "Variant C-desktop", "Variant D-desktop", "Variant C-mobile", and "Variant D-mobile".
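For illustration only, a minimal sketch of how the four-cell naming could play out under deterministic 50/50 bucketing. This is hypothetical: the task does not specify the assignment mechanism, and the function names below are invented for the example.

```python
import hashlib

def assign_variant(user_id: int) -> str:
    """Hypothetical deterministic 50/50 bucketing into C or D.

    Not the actual assignment logic, which this task does not describe;
    shown only to make the four-cell naming concrete.
    """
    digest = hashlib.sha256(str(user_id).encode("utf-8")).hexdigest()
    return "C" if int(digest, 16) % 2 == 0 else "D"

def shown_design(user_id: int, platform: str) -> str:
    # A user's assigned variant plus the platform they visit from
    # determines which of the four designs they see.
    return f"Variant {assign_variant(user_id)}-{platform}"

print(shown_design(12345, "mobile"))   # e.g. "Variant D-mobile"
print(shown_design(12345, "desktop"))  # same variant, desktop design
```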
Specifications
- T250331: Variant tests: C-desktop
- T250343: Variant tests: D-desktop
- T250440: Variant tests: C-mobile
- T250451: Variant tests: D-mobile