commonswiki_file shards have crossed our 50GB threshold in codfw and cloudelastic.
Change primary shard count in mediawiki-config and deploy it, then kick off reindex
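For context on the task description above, the arithmetic behind picking a new primary shard count can be sketched as follows. This is an illustrative helper, not anything from mediawiki-config; the 550 GB example figure is hypothetical.

```python
import math

# Per-shard alert threshold referenced in this task (GB).
THRESHOLD_GB = 50

def min_primary_shards(total_index_gb: float, threshold_gb: float = THRESHOLD_GB) -> int:
    """Smallest primary shard count that keeps the average shard size
    at or under the threshold (data is spread evenly across primaries)."""
    return max(1, math.ceil(total_index_gb / threshold_gb))

# e.g. a hypothetical ~550 GB index would need at least 11 primaries
print(min_primary_shards(550))  # → 11
```

Once the new count is set in mediawiki-config and deployed, the index still has to be rebuilt, since Elasticsearch primaries cannot be changed in place on a live index.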
| Subject | Repo | Branch | Lines +/- |
|---|---|---|---|
| cirrussearch: increase up commonswiki_file shards | operations/mediawiki-config | master | +4 -4 |
Change 628980 had a related patch set uploaded (by Ryan Kemper; owner: Ryan Kemper):
[operations/mediawiki-config@master] cirrussearch: increase up commonswiki_file shards
Having to keep increasing the number of shards for commonswiki is starting to be an issue. We need a better strategy.
Change 634381 had a related patch set uploaded (by Ryan Kemper; owner: Ryan Kemper):
[operations/puppet@production] Bring 3 new eqiad wdqs nodes into service
Change 628980 abandoned by Ryan Kemper:
[operations/mediawiki-config@master] cirrussearch: increase up commonswiki_file shards
Reason:
we ended up just bumping the limit, so there is no need to re-shard this index anymore
Closing this ticket because we ended up changing the alert thresholds, which removes the need to re-shard the index.
Housekeeping note: see https://phabricator.wikimedia.org/T265908 for the patch that changes the alert thresholds, which should clear the alerts that led to this ticket originally being created.
Shard limit temporarily increased: https://gerrit.wikimedia.org/r/c/operations/puppet/+/650021
Will re-index after the new year
@RKemper I'm assigning this task to you, since it seems you're acting on it. Please remove yourself as assignee if that's not ok.
We could probably cancel this? In T271493 we are fixing the data size issues, which will remove the need to re-shard.