Drop random indexes of sites table in production
Closed, Resolved (Public)

Description

Deployment of T342856: Drop random indexes of sites table

  1. ALTERs to run: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/995037/2/maintenance/archives/patch-sites-drop_indexes.sql
    1. Maybe add "optimize table sites;" just for fun
  2. Where to run those changes: all.dblist
  3. When to run those changes: any time
  4. If the schema change is backwards compatible: Yes
  5. If the schema change has been tested already on some of the test/beta wikis: the beta cluster is already running the new schema
  6. If the data should be made available on the labs replicas and/or dumps: no change to the existing rules

The sites table has only 1000 rows and barely gets written (or read, for that matter). I'd say just run it on the master with replication.

DROP INDEX IF EXISTS site_type ON /*_*/sites;
DROP INDEX IF EXISTS site_group ON /*_*/sites;
DROP INDEX IF EXISTS site_source ON /*_*/sites;
DROP INDEX IF EXISTS site_language ON /*_*/sites;
DROP INDEX IF EXISTS site_protocol ON /*_*/sites;
DROP INDEX IF EXISTS site_domain ON /*_*/sites;
DROP INDEX IF EXISTS site_forward ON /*_*/sites;
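
These statements rely on MariaDB's DROP INDEX IF EXISTS, and /*_*/ is MediaWiki's table-prefix placeholder (empty on Wikimedia production, so the table is just sites). A quick post-run sanity check, sketched here rather than being part of the patch, assuming direct access to a wiki database:

-- Post-run check: list the indexes that remain on the sites table.
-- Only the primary key and the unique site_global_key index should be left.
SHOW INDEX FROM sites;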

Progress

  • s1
  • s2
  • s3
  • s4
  • s5
  • s6
  • s7
  • s8
  • labtestwiki

Event Timeline

Marostegui triaged this task as Medium priority. Feb 2 2024, 6:12 AM
Marostegui moved this task from Triage to Ready on the DBA board.
Marostegui subscribed.

If we want to run it directly on the master, let's add "IF EXISTS"
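
With IF EXISTS, a missing index produces a note instead of an error, so the statement replicates cleanly even to a host where the index is already gone. A minimal MariaDB sketch of that behaviour:

-- Running the drop twice: the second execution records note 1091
-- ("Can't DROP ...; check that column/key exists") rather than failing.
DROP INDEX IF EXISTS site_type ON sites;
DROP INDEX IF EXISTS site_type ON sites;
SHOW WARNINGS;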

Mentioned in SAL (#wikimedia-operations) [2024-02-05T06:54:28Z] <marostegui> Drop indexes on site table on s8 T356417

Not worth the optimize; the tables are tiny (320K in enwiki or wikidata).

Mentioned in SAL (#wikimedia-operations) [2024-02-05T06:56:19Z] <marostegui> dbmaint Drop indexes on site table on s8 T356417

Ah, my bad. The table is the same across all wikis, so in s3, which hosts hundreds of small wikis, it would make a small dent, but not in any other section.

Going for s3 now. This will take several hours, as a sleep is placed between wikis to avoid replication lag.
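
The per-wiki pacing is roughly the loop below; a minimal sketch, not the actual WMF tooling. It assumes a dblist file ("s3.dblist", one database name per line) and a mysql client already configured via option files to reach the section master:

import subprocess
import time

SLEEP_BETWEEN_WIKIS = 60  # seconds between wikis; assumed value, tuned to keep lag low

def run_sql_on_db(db: str, sql_file: str) -> None:
    # Pipe the patch file into the mysql client for one wiki's database.
    # /*_*/ in the patch is an SQL comment, so the client applies it as-is.
    with open(sql_file) as sql:
        subprocess.run(["mysql", db], stdin=sql, check=True)

with open("s3.dblist") as f:
    dbs = [line.strip() for line in f if line.strip()]

for db in dbs:
    run_sql_on_db(db, "patch-sites-drop_indexes.sql")
    time.sleep(SLEEP_BETWEEN_WIKIS)  # let replicas catch up before the next wiki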

Marostegui updated the task description. (Show Details)