
Provide a way for "bot" patch providers (like LibUp or lsc) to be low-priority, and for Zuul to try to merge them later, after human-C+2'ed patches
Closed, DeclinedPublic

Description

Suggested in the context of https://gerrit.wikimedia.org/r/c/mediawiki/extensions/WikibaseCirrusSearch/+/1129558 – ~200 patches were submitted slowly so as not to overwhelm CI, but they were then mass-C+2'ed by a number of people. At peak there were ~50 "bot" patches in the gate-and-submit queue, crowding out higher-priority 'real' patches.

Normally we just try to do these runs on weekends to avoid clashing with real use, but that (a) doesn't always work, and (b) doesn't scale.

Event Timeline

hashar added subscribers: Esanders, thiemowmde, Umherirrender and 2 others.

This is not possible in Zuul: the queue is shared between users, and Zuul has no concept of change priority.

Looking at https://gerrit.wikimedia.org/r/q/topic:%22bump-mediawiki-requires%22, the changes were mass-approved by @Esanders, @Reedy, @thiemowmde, @Umherirrender, among others, and I would expect all of them to know that mass CR+2 approvals cause a large queue to build up in Zuul and overload CI. But I can't blame them for doing the approvals without throttling their reviews either.

For that specific set of changes (bumping the required MediaWiki version in extension.json), LibUp should send them with Code-Review +2 already applied, much like l10n-bot does, and the test pipeline would reject those (T357080: reject-approval: [ code-review: 2 ]). The review would then happen in LibUp, and once validated the batch would be sent throttled.
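For context, a reject-approval rule of the kind referenced above can be sketched in a Zuul v2-style layout roughly as follows. This is an illustrative fragment, not Wikimedia's actual layout: the pipeline name, manager, and trigger event are assumptions, and the point is only the `reject` stanza, which keeps changes that already carry Code-Review +2 out of the test pipeline so pre-approved bot batches skip the redundant test run:

```yaml
# Sketch of a Zuul v2-style pipeline definition (illustrative only).
pipelines:
  - name: test
    manager: IndependentPipelineManager
    trigger:
      gerrit:
        - event: patchset-created
    # Do not enqueue changes that are already approved with CR+2;
    # those are expected to go straight to gate-and-submit.
    reject:
      approval:
        - code-review: 2
```

Under this scheme the throttling itself still has to live in the bot (LibUp spacing out its pushes), since Zuul only filters what enters the pipeline; it does not rate-limit or deprioritize a submitter.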