
[Investigate] Wikidata revert model's precision and recall (filter rate)
Closed, Resolved · Public

Description

Question: What proportion of human edits will need to be reviewed if we want 95% recall?

Methods:

  1. Gather a random sample of Wikidata edits
  2. Label each edit as reverted or not (see the sketch after this list)
  3. Explore the dataset of non-reverted damage
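
A minimal sketch of steps 1–2, in Python against the public MediaWiki API on wikidata.org. It uses recentchanges as a simple stand-in for a true random sample and a simplified identity-revert check; the sample size, revert radius, and function names are illustrative, not the exact setup used for the labeled set linked below.

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()
session.headers["User-Agent"] = "wikidata-revert-sample-sketch/0.1 (example)"

def sample_edits(limit=500):
    """Pull recent item-namespace edits as a stand-in for a random sample.

    rcprop=ids returns revid, old_revid (the parent), and pageid, which is
    everything the revert check below needs.
    """
    resp = session.get(API, params={
        "action": "query", "format": "json",
        "list": "recentchanges",
        "rcnamespace": 0, "rctype": "edit",
        "rclimit": limit, "rcprop": "ids|title",
    })
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

def was_reverted(edit, radius=15):
    """Identity-revert check: the edit counts as reverted if one of the next
    `radius` revisions restores the parent revision's content (same SHA1).
    Simplified -- ignores self-reverts and reverts back to older ancestors.
    """
    if not edit.get("old_revid"):
        return False
    resp = session.get(API, params={
        "action": "query", "format": "json",
        "pageids": edit["pageid"], "prop": "revisions",
        "rvprop": "ids|sha1",
        "rvstartid": edit["old_revid"],   # start from the parent revision
        "rvdir": "newer",
        "rvlimit": radius + 2,            # parent + this edit + `radius` followers
    })
    resp.raise_for_status()
    page = next(iter(resp.json()["query"]["pages"].values()))
    revisions = page.get("revisions", [])
    if len(revisions) < 3:
        return False
    parent_sha1 = revisions[0].get("sha1")
    this_sha1 = revisions[1].get("sha1")
    later_sha1s = [r.get("sha1") for r in revisions[2:]]
    return this_sha1 != parent_sha1 and parent_sha1 in later_sha1s

if __name__ == "__main__":
    labeled = [(e["revid"], was_reverted(e)) for e in sample_edits(limit=100)]
    print(sum(reverted for _, reverted in labeled), "of", len(labeled), "edits were reverted")
```

The check treats an edit as reverted when a later revision within a fixed radius restores the parent's content hash, which approximates how revert labels are usually produced for this kind of sample.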

Event Timeline

Halfak raised the priority of this task from to Needs Triage.
Halfak updated the task description. (Show Details)
Halfak moved this task to Backlog on the Machine-Learning-Team (Active Tasks) board.
Halfak subscribed.

Started some work here. This is based on a random sample of Wikidata edits:

https://etherpad.wikimedia.org/p/revscoring_wikidata_reverted_set

OK. If we draw the cutoff at 0.93, we'll flag 100/10,000 edits for review, and that will account for (as far as we can tell) all of the vandalism!
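
As a rough sketch of that cutoff arithmetic, assuming the labeled sample is available as (score, was_reverted) pairs: recall is the share of reverted edits at or above the cutoff, and the review (filter) rate is the share of all edits at or above it. The function names and the recall-targeting helper are hypothetical, not part of the Etherpad analysis.

```python
def filter_stats(scored_edits, cutoff):
    """scored_edits: list of (score, was_reverted) pairs.

    Returns (review_rate, recall) at the given cutoff:
      review_rate -- share of all edits scoring >= cutoff (work for reviewers)
      recall      -- share of reverted edits scoring >= cutoff (damage caught)
    """
    scored_edits = list(scored_edits)
    flagged = [(s, r) for s, r in scored_edits if s >= cutoff]
    total_reverted = sum(r for _, r in scored_edits)
    review_rate = len(flagged) / len(scored_edits)
    recall = (sum(r for _, r in flagged) / total_reverted) if total_reverted else 1.0
    return review_rate, recall

def cutoff_for_recall(scored_edits, target_recall=0.95):
    """Highest cutoff that still reaches the target recall (e.g. the 95%
    in the task description).  Scans candidate cutoffs strictest-first."""
    scored_edits = list(scored_edits)
    for cutoff in sorted({s for s, _ in scored_edits}, reverse=True):
        review_rate, recall = filter_stats(scored_edits, cutoff)
        if recall >= target_recall:
            return cutoff, review_rate, recall
    return 0.0, 1.0, 1.0

# e.g. scored = [(0.97, True), (0.40, False), ...]   # one pair per sampled edit
# print(filter_stats(scored, cutoff=0.93))
# print(cutoff_for_recall(scored, target_recall=0.95))
```

With numbers like the ones above, a 0.93 cutoff corresponds to a ~1% review rate (100 of 10,000 edits) while catching all of the vandalism found in the sample.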

Halfak set Security to None.
Halfak moved this task from Backlog to Completed on the Machine-Learning-Team (Active Tasks) board.