Unlike previous interventions by the Growth Team, we do not plan to run Newcomer Tasks as a straightforward A/B test to see if it improves activation and retention. This is partly because we see this feature as an incremental change to the Homepage itself, and we have already been running an A/B test on the Homepage to determine its effect on activation and retention. It is also because testing multiple variants of the feature will teach us more about how to improve newcomer tasks.
Therefore, for Newcomer Tasks we plan to give the intervention to a large proportion of new users and keep a small control group, in what we will refer to as an A/B/C test. In this configuration the A and B groups will both get the Newcomer Tasks intervention, but different variants of it (some possible variants are described below). The C group is our control group, which will not have access to the Homepage by default. The proposed distribution of these groups is 40% / 40% / 20%.
This task is to make it possible to experiment with multiple variants simultaneously, and to record and manage which users are receiving which variants in order to facilitate analysis.
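As a rough illustration of the kind of variant management described above, the sketch below assigns new users deterministically to the A, B, and C groups in the proposed 40% / 40% / 20% split by hashing the user id. All names here (`assign_group`, `GROUPS`) are hypothetical and chosen for illustration; the actual implementation and storage of assignments will be determined by the specifications referenced below.

```python
import hashlib

# Hypothetical sketch: A and B are the two Newcomer Tasks variants,
# C is the control group without the Homepage. Weights sum to 100.
GROUPS = [("A", 40), ("B", 40), ("C", 20)]

def assign_group(user_id: int) -> str:
    """Map a user id to a group deterministically, so repeated
    lookups for the same user always return the same group."""
    digest = hashlib.sha256(str(user_id).encode()).hexdigest()
    bucket = int(digest, 16) % 100  # uniform value in 0..99
    threshold = 0
    for name, weight in GROUPS:
        threshold += weight
        if bucket < threshold:
            return name
    return GROUPS[-1][0]  # unreachable when weights sum to 100

# Assignments would be recorded at registration time so that later
# analysis can join each user's behavior to their variant.
assignments = {uid: assign_group(uid) for uid in range(1000)}
```

Deterministic hashing avoids having to store a random draw before it is needed, though in practice the assignment would still be persisted per user to support analysis.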
The specifications for how we want to run these tests are listed in this section of the newcomer tasks measurement plan.