In T372912#10336684, the Search team highlighted that large search index updates for Commons can be ~40x bigger than usual, and asked whether this can be understood and optimized.
The intuition is that big updates happen when many weighted tag scores change between reindexes.
Possible solutions:
- compute the delta after stripping the score from weighted tags
- round the score to reduce variation at compute time, i.e., `round(x / 10) * 10` or `(1 + floor(x / 10)) * 10` (we decided to opt for this one)
  - the delta computation is left intact
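A minimal sketch of the two rounding variants under consideration (the function names are hypothetical, for illustration only). Quantizing scores to multiples of 10 means small score fluctuations map to the same bucket, so they no longer produce a delta:

```python
import math

def round_to_nearest_10(x: float) -> int:
    """Variant 1: round(x / 10) * 10, the nearest multiple of 10."""
    return round(x / 10) * 10

def round_up_to_next_10(x: float) -> int:
    """Variant 2: (1 + floor(x / 10)) * 10, the next multiple of 10 above x."""
    return (1 + math.floor(x / 10)) * 10

# With variant 1, a score drifting from 41 to 43 stays in the 40 bucket,
# so the reindex delta for that tag is empty.
print(round_to_nearest_10(41), round_to_nearest_10(43))  # 40 40
print(round_up_to_next_10(41), round_up_to_next_10(43))  # 50 50
```

Note that the two variants bucket differently: the nearest-multiple form splits a decade at its midpoint (41 and 47 land in different buckets), while the round-up form maps the whole decade 40–49 to 50.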
Then, coordinate with the Search team to ingest the initial big delta that the rounding change will produce.