We have defined our research questions, measurements, experiment plan, and leading indicators in the "[[ https://docs.google.com/document/d/1gp7KpgZdulA-fdHDsZrBkMrkys4IgzD1-WEhj29bQjc/edit?ts=601b1cbd# | Add a Link - Measurement Specification ]]". This task contains the full set of instrumentation needs, which are also copied into "instrumentation" sections on the relevant frontend tasks.
---
This instrumentation will exist both on the desktop and mobile versions of the feature. For users who have the Homepage enabled, we want to capture their user ID along with the information below whenever they interact with an “add a link” task.
For all measurements, we will want to:
* Distinguish by wiki.
* Analyze how much time elapses between each action.
* Know whether each action was taken from desktop or mobile.
* Know which article the user is on (through page title and page id).
* Include timestamps with millisecond precision.
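The shared requirements above can be sketched as a common set of event fields. All field names below are illustrative assumptions, not a finalized schema:

```python
import time

def common_event_fields(wiki, platform, page_title, page_id):
    """Fields attached to every "add a link" event.

    Field names here are illustrative, not a finalized schema.
    """
    assert platform in ("desktop", "mobile")
    return {
        "wiki": wiki,                  # distinguish by wiki
        "platform": platform,          # desktop vs. mobile
        "page_title": page_title,      # article the user is on
        "page_id": page_id,
        # Millisecond-precision timestamp; also what lets us compute
        # the time elapsed between consecutive actions.
        "timestamp_ms": int(time.time() * 1000),
    }
```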
**Existing instrumentation**
The instrumentation that already tracks the user’s behavior on the newcomer homepage and in the suggested edits module should continue unchanged, except that “add a link” should be recorded as a new task type distinguishable from the legacy link task.
The help panel, which will contain guidance specific to “add a link”, should likewise keep the same instrumentation as legacy suggested edits, with the addition of an indicator for this new task type.
The post-edit dialog should also keep its existing instrumentation, again with the addition of an indicator for this new task type.
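One way the new-task-type indicator could be attached to the existing events; the field name and default value are hypothetical, not the production schema:

```python
def with_task_type(event, task_type="addlink"):
    """Tag an existing suggested-edits event so "add a link" can be
    distinguished from the legacy link task.

    Both the "task_type" field name and the "addlink" value are
    illustrative assumptions.
    """
    return {**event, "task_type": task_type}
```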
**Onboarding**
{T278111}
**No suggestions available**
{T278112}
**AI suggestions mode**
{T278114}
**Edit mode toggle**
{T278115}
**Link inspector**
As the user proceeds through the task, they will use the “link inspector” dialog. We need to record events for these actions:
* An impression for when the link inspector focuses on a link.
* Selecting the left arrow to go back.
* Selecting the right arrow to skip forward.
* Selecting “yes”.
* Selecting “no”.
* Opening up the blue link to view the full article of the link target.
* Reopening the rejection dialog via the ellipsis link.
* Mobile only: tapping the “?” to open the help panel.
For each of those events, we want to include:
* Which suggestion in the series they are on (e.g. “2/7”).
* The anchor text for the link suggestion.
* The target article for the link suggestion.
* The probability score for the link suggestion.
If the user changes their response on a given link, that should be recorded as an additional event, but we do not need to indicate that it was a change.
We also need to record an event for when the user selects a link suggestion in the body of the article by tapping or clicking on it.
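The link inspector events and their per-suggestion context might look like the following sketch; the action names and field names are assumptions, not a finalized schema:

```python
def link_inspector_event(action, position, total, anchor_text, target_title, score):
    """One event from the link inspector. All names are illustrative."""
    actions = {
        "impression",        # inspector focuses on a link
        "back",              # left arrow
        "skip",              # right arrow
        "accept",            # selecting "yes"
        "reject",            # selecting "no"
        "view_target",       # opening the blue link to the target article
        "reopen_rejection",  # reopening the rejection dialog via the ellipsis
        "open_help",         # mobile only: "?" opens the help panel
        "select_in_body",    # tapping/clicking a suggestion in the article body
    }
    assert action in actions
    return {
        "action": action,
        "series_position": f"{position}/{total}",  # e.g. "2/7"
        "anchor_text": anchor_text,
        "target_title": target_title,
        "probability_score": score,
    }
```

A changed response would simply emit a second `accept` or `reject` event for the same suggestion.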
**Rejection dialog**
After a user selects “no” on a suggestion, the rejection dialog opens. We need to record:
* An impression for the dialog opening.
* An event for when the dialog closes, along with the selection the user made in the list of options. We do not need separate events as the user taps around on different options -- only their final selection.
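The “final selection only” behavior could be sketched as follows; class, method, and field names are all hypothetical:

```python
class RejectionDialog:
    """Logs one impression on open and one event on close carrying the
    user's final selection. Intermediate taps are tracked locally and
    never logged. Illustrative sketch, not production code."""

    def __init__(self, log):
        self.log = log
        self.selection = None
        self.log({"action": "rejection_dialog_impression"})

    def tap(self, option):
        self.selection = option  # deliberately not logged

    def close(self):
        self.log({"action": "rejection_dialog_close",
                  "selection": self.selection})
```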
**Edit summary**
We need to record when the user attempts to advance to the edit summary screen, which can happen either by selecting the explicit “Publish/Submit” button or by advancing/skipping past the last link suggestion. When they do, we should record which route they took.
If the user has skipped all suggestions, they’ll get the “skipped all suggestions” dialog. We should record:
* Impression for the dialog.
* An event for which button the user selects.
If the user has not skipped all suggestions, they’ll go to the edit summary screen. We need to record:
* Impression for the screen, along with counts of its contents, e.g. “7 accepted, 2 rejected, 1 skipped”.
* An event when the user selects the option to “Review your changes”.
* An event when the user selects the arrow to return to the article.
* An event when the user selects the Publish/Submit button.
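The edit summary impression, including the counts and the route taken to reach the screen, might be sketched like this; field names and route values are assumptions:

```python
def edit_summary_impression(accepted, rejected, skipped, route):
    """Impression for the edit summary screen.

    `route` records how the user advanced: via the explicit
    Publish/Submit button or by advancing/skipping past the last
    suggestion. All names and values are illustrative.
    """
    assert route in ("publish_button", "advanced_past_last")
    return {
        "action": "edit_summary_impression",
        "route": route,
        "accepted_count": accepted,
        "rejected_count": rejected,
        "skipped_count": skipped,
        "summary": f"{accepted} accepted, {rejected} rejected, {skipped} skipped",
    }
```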