Stats: http://labels.wmflabs.org/stats/arwiki/30
Contact: @Ghassanmas
- Announce the campaign
- Status update no. 1 @ 40%
- Status update no. 2 @ 80%
- Status update no. 3 @ 100%
- File task for building models based on labeled data.
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | | awight | T192498 Deploy ORES advanced editquality models to arwiki
Resolved | | Halfak | T189710 Train and test damaging/goodfaith model for arwiki
Resolved | | awight | T131669 Complete edit quality campaign for Arabic Wikipedia
Hi @Ghassanmas! It looks like this campaign has stalled out at 53%. I think it's time to give another status update. Would you be willing to post one?
As I can see, @Ghassanmas has been inactive for more than a year (here and on the Wikipedia project).
Hi @alanajjar! Thanks for chiming in. Right now, I think the labeling campaign needs another announcement / status update. I'd suggest posting somewhere visible on Arabic Wikipedia to say, "We're more than half-way done gathering training data for ORES -- the machine learning system that will help us find vandalism and other types of damaging edits more easily. We need help finishing the tagging of a random sample of edits as damaging or not and goodfaith or badfaith. ORES will use this to learn what damaging/goodfaith edits look like. See https://labels.wmflabs.org/ui/arwiki. Request a "workset" and start labeling!"
If you view http://labels.wmflabs.org/stats/arwiki/30, you can see who has been doing the labeling work so far and ping those editors specifically to ask them to come back and do more work. I see that @Ghassanmas has done a very large amount of work there. It's a shame he's gone. That work is very valuable.
Hello everyone, I am sorry for not being active for so long! I had some personal issues, and I was also away from keyboard and busy for quite a long time.
@Halfak I had almost forgotten about the campaign, thanks for reminding me. I will definitely find time to get back on track, as well as to participate in other work.
@Halfak There are 96 edits left to label, but all of them seem to point to deleted or unavailable edits: all 50 edits in the requested sample returned this message or a similar one: "code: revision not found". So I guess when labeling an edit is aborted, it may show up again in a newly requested sample of edits.
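One way to confirm the "revision not found" cases is to query the MediaWiki API with the revision IDs from the workset: in an `action=query&revids=...` response, revisions that have been deleted or suppressed are reported under `badrevids` rather than `pages`. Below is a hedged sketch that parses such a response; the `response` dict is a hand-written example of the assumed shape, not real arwiki data.

```python
# Sketch: detect unavailable revisions from a MediaWiki API "query" response.
# Deleted or suppressed revisions come back under "badrevids" keyed by revid.
def missing_revisions(api_response):
    """Return the set of revision IDs the wiki reports as unavailable."""
    bad = api_response.get("query", {}).get("badrevids", {})
    return {int(rid) for rid in bad}

# Hypothetical response for two revids, one of which was deleted:
response = {
    "query": {
        "badrevids": {"12345": {"revid": 12345}},
        "pages": {
            "99": {
                "pageid": 99,
                "title": "Example",
                "revisions": [{"revid": 67890}],
            }
        },
    }
}

print(sorted(missing_revisions(response)))  # [12345]
```

Checking a workset's revision IDs this way would show whether the stuck 96 edits really point at deleted revisions, or whether the labeling tool is mis-reporting them.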