Now that the service performs responsively for most simple queries, we need to optimise for some of our worst cases, such as users with very high edit counts. These users can produce a large number of similarities and overlaps to test, which in turn generates a large number of requests to the MediaWiki API, one per overlapping user. Currently these requests are made sequentially; we need to parallelise them.
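One way the sequential fan-out could be parallelised is with a bounded thread pool, issuing one request per overlapping user concurrently. This is only a sketch: `fetch_user_edits` is a hypothetical stand-in for the real MediaWiki API call, and the function names and concurrency limit are assumptions, not part of the actual service.

```python
import concurrent.futures
import time


def fetch_user_edits(username: str) -> dict:
    # Hypothetical placeholder for the real MediaWiki API request;
    # the actual service would perform an HTTP call here.
    time.sleep(0.05)  # simulate network latency
    return {"user": username, "edits": []}


def fetch_all_parallel(usernames, max_workers=8):
    # Issue one request per overlapping user concurrently rather than
    # sequentially, capping concurrency so the API is not overloaded.
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the input order of usernames in the results.
        return list(pool.map(fetch_user_edits, usernames))
```

With a cap of 8 workers, a batch of overlapping users completes in roughly ceil(n/8) round-trip times instead of n, while keeping backpressure on the number of simultaneous API connections.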
Description
Status | Subtype | Assigned | Task
---|---|---|---
Open | None | | T259471 Sockpuppet detection API [low effort]
Declined | None | | T272701 Sockpuppet Detection: Parallelise requests to Mediawiki API
Event Timeline
@hnowlan: Removing task assignee as this open task has been assigned for more than two years - See the email sent to task assignee on February 22nd, 2023.
Please assign this task to yourself again if you still realistically plan to work on this task - it would be welcome! :)
If this task has been resolved in the meantime, or should not be worked on by anybody ("declined"), please update its task status via "Add Action… 🡒 Change Status".
Also see https://www.mediawiki.org/wiki/Bug_management/Assignee_cleanup for tips on how to best manage your individual work in Phabricator. Thanks!