We want to learn more about existing solutions for automated anti-vandalism so that we can make informed decisions for Automoderator.
Nine Wikipedia communities have developed bots which automatically revert edits based on algorithms - generally machine learning models. We want to understand how much of the anti-vandalism burden these bots take on in their communities so that we can make informed judgements about the potential impact of Automoderator.
| Project | Bot |
|---|---|
| en.wiki | ClueBot NG |
| es.wiki | SeroBOT |
| fr.wiki & pt.wiki | Salebot |
| fa.wiki | Dexbot |
| bg.wiki | PSS 9 |
| simple.wiki | ChenzwBot |
| ru.wiki | Рейму Хакурей |
| ro.wiki | PatrocleBot |
Questions:
- In each of the Wikimedia projects above, how many reverts does their anti-vandalism bot make per day, on average?
- What is this as a percentage of all reverts made within 24 hours of an edit occurring?
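The two questions above can be answered from revert event data (e.g. derived from the MediaWiki history dataset or the revisions API). A minimal sketch, using hypothetical sample records with a `(day, reverting_user, seconds_since_original_edit)` shape that is an assumption for illustration, not an actual dataset schema:

```python
from collections import Counter

# Hypothetical revert events: (day, reverting user, seconds between the
# original edit and its revert). Real data would come from the MediaWiki
# history dataset or the revisions API.
reverts = [
    ("2023-05-01", "ClueBot NG", 120),
    ("2023-05-01", "HumanPatroller", 3600),
    ("2023-05-02", "ClueBot NG", 45),
    ("2023-05-02", "ClueBot NG", 90_000),   # reverted after more than 24h
    ("2023-05-02", "HumanPatroller", 600),
]

DAY_SECONDS = 24 * 60 * 60

def bot_revert_stats(events, bot_name):
    """Return (average reverts/day by the bot, the bot's share of all
    reverts made within 24 hours of the original edit)."""
    bot_daily = Counter(day for day, user, _ in events if user == bot_name)
    days = {day for day, _, _ in events}
    avg_per_day = sum(bot_daily.values()) / len(days)

    within_24h = [user for _, user, secs in events if secs <= DAY_SECONDS]
    bot_share = sum(1 for user in within_24h if user == bot_name) / len(within_24h)
    return avg_per_day, bot_share

avg, share = bot_revert_stats(reverts, "ClueBot NG")
```

On this toy sample, the bot averages 1.5 reverts/day and accounts for 50% of reverts made within 24 hours.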
Expanded scope:
Initial data shows that, on average, 10% of the bot reverts are themselves reverted. However, some of these may be the original vandals restoring their reverted edits rather than good-faith editors overturning false positives. So we want to investigate further here, to understand the following:
- How many of the re-reverts are possibly vandalism? Not all reverted edits are necessarily vandal edits. While there is no direct way to determine this, exploring revert rates by user segment (registered vs. anonymous, edit-count buckets, etc.) might provide helpful comparative data.
- Who is reverting the reverts of the anti-vandal bots?
- PatruBOT was the anti-vandal bot on Spanish Wikipedia before SeroBOT. Analyse the false positive rate of PatruBOT before it was shut down.
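One way to approach the segmentation questions above is to split re-reverts of bot reverts into rough buckets: the original editor restoring their own edit (possible vandal pushback) versus other registered or anonymous users overturning the bot (potential false positives). A sketch under assumed field names (`re_reverter`, `registered`, `original_editor` are hypothetical, not an actual schema):

```python
from collections import Counter

# Hypothetical re-revert records: each is a revert of an anti-vandal
# bot's revert. Fields: who re-reverted, whether they are registered,
# and whose edit the bot had originally reverted.
re_reverts = [
    {"re_reverter": "203.0.113.5", "registered": False, "original_editor": "203.0.113.5"},
    {"re_reverter": "ExperiencedUser", "registered": True, "original_editor": "203.0.113.5"},
    {"re_reverter": "198.51.100.7", "registered": False, "original_editor": "198.51.100.9"},
]

def segment_re_reverts(records):
    """Bucket re-reverts: the original editor restoring their own edit,
    vs. other registered or anonymous users overturning the bot."""
    counts = Counter()
    for r in records:
        if r["re_reverter"] == r["original_editor"]:
            counts["self_restore"] += 1
        elif r["registered"]:
            counts["other_registered"] += 1
        else:
            counts["other_anonymous"] += 1
    return counts
```

A high `other_registered` share would suggest false positives rather than vandal pushback; a high `self_restore` share would suggest the opposite.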