Right now AbuseFilter provides per-filter stats on mean run time and number of conditions used (or it would, if it weren't horribly broken per T53294). I would like to propose adding stats on max run time / conditions, to help identify filters that are only occasionally very slow. Since most filters start with a series of fast early selection criteria (e.g. checking the article namespace, or whether the user is autoconfirmed), it isn't always obvious how fast or slow the guts of the filter may be. Recording the longest run time observed in the sample would help identify filters that occasionally perform badly. The present system tracks the mean properties of the last 10,000 actions; also adding the max values to the statistics would not be difficult.
|Status||Assignee||Task|
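To make the proposal concrete, here is a minimal sketch (in Python, purely illustrative; the class and field names are hypothetical and not AbuseFilter's actual code) of a per-filter profile that tracks the existing mean stats plus the proposed max values over a capped sample:

```python
class FilterProfile:
    """Aggregates run-time stats for one filter over a capped sample.

    Hypothetical sketch: assumes counters are reset once the sample
    reaches the cap (e.g. 10,000 actions), mirroring the idea of
    profiling "the last N actions".
    """

    def __init__(self, cap=10_000):
        self.cap = cap            # sample size after which counters reset
        self.count = 0
        self.total_time = 0.0     # accumulated run time, for the mean
        self.total_conds = 0      # accumulated conditions used, for the mean
        self.max_time = 0.0       # proposed: worst observed run time
        self.max_conds = 0        # proposed: worst observed condition count

    def record(self, run_time, conds_used):
        """Record one filter execution."""
        if self.count >= self.cap:
            self.__init__(self.cap)   # start a fresh sample window
        self.count += 1
        self.total_time += run_time
        self.total_conds += conds_used
        self.max_time = max(self.max_time, run_time)
        self.max_conds = max(self.max_conds, conds_used)

    def stats(self):
        """Return mean and max stats, or None if nothing was recorded."""
        if self.count == 0:
            return None
        return {
            "mean_time": self.total_time / self.count,
            "mean_conds": self.total_conds / self.count,
            "max_time": self.max_time,
            "max_conds": self.max_conds,
        }
```

The point of the sketch is that the max fields cost only two extra comparisons per recorded action on top of the counters the mean already requires.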
|Open||None||T209023 Major overhaul for AbuseFilter|
|Open||Daimona||T193064 Create a dedicated page for stats|
|Open||Daimona||T191428 Enhance filter profiling|
|Resolved||Daimona||T191430 Re-enable profiling for stashed edits|
|Resolved||Daimona||T191039 Re-enable filter profiling on every wiki|
|Resolved||Daimona||T53294 Overhaul internal profiling system|
|Open||None||T194593 Rework AbuseFilterPager to allow more flexible sorting|
|Open||Daimona||T87862 Show AbuseFilter average run time and used conditions in main table|
|Resolved||matej_suchanek||T71492 Edits which hit the condition limit should be tagged|
|Open||None||T90669 Add a total run time limit to AbuseFilter|
|Open||Daimona||T90754 Allow tracking of max run time / conditions consumed in AbuseFilter|
An important note, which I'm putting here as a reminder: if we merge the patch for T53294, let it go live, and it registers even a single edit, then when these stats are later added, errors will start to pop up due to missing data in old entries. A solution could be to delete all stash keys, set AbuseFilterProfileActionsCap to 0 (no profiling), deploy this, and then re-enable profiling.
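An alternative to wiping the stash would be to read old entries defensively. A minimal sketch (Python for illustration; the key names and dict-based storage are assumptions, not AbuseFilter's actual stash format) of tolerating entries that predate the new fields:

```python
def read_profile(entry):
    """Return profiling stats, tolerating old entries that lack the
    proposed max fields.

    Hypothetical sketch: assumes each stash entry is a dict. Entries
    written before the max stats were introduced simply won't have the
    "max-time" / "max-conds" keys, so we default them to None rather
    than raising a KeyError.
    """
    return {
        "count": entry.get("count", 0),
        "total-time": entry.get("total-time", 0.0),
        "total-conds": entry.get("total-conds", 0),
        # Missing in pre-migration entries; None lets the UI show
        # "no data" instead of erroring out.
        "max-time": entry.get("max-time"),
        "max-conds": entry.get("max-conds"),
    }
```

This trades a one-time stash wipe for a permanent (small) bit of backward-compatibility logic, so wiping may still be the simpler option.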