Import x1 on dbstore2001
Closed, ResolvedPublic

Description

x1 needs to be imported into dbstore2001.
It is not as simple as dumping the tables from one of the x1 hosts: depending on the wiki, some tables live on production instead of on x1.
See:

T151552#2876669
T151552#2880669
T153638

Restricted Application added a subscriber: Aklapper. · Mar 6 2017, 2:35 PM
Marostegui moved this task from Triage to Next on the DBA board. · Mar 6 2017, 2:35 PM

Mentioned in SAL (#wikimedia-operations) [2017-03-08T09:39:32Z] <marostegui> Stop replication on db2033 - T159707

To start the import, I have renamed the tables that are not supposed to be in use on dbstore2001, that is, all the echo_% tables in every database except the following:

mediawikiwiki
metawiki
officewiki
root@dbstore2001:/srv/sqldata# for i in mediawikiwiki metawiki officewiki; do echo $i; ls -lhS $i/echo* | grep ibd;done
mediawikiwiki
-rw-rw---- 1 mysql mysql 1.3G Mar  8 10:07 mediawikiwiki/echo_notification.ibd
-rw-rw---- 1 mysql mysql 156M Mar  8 09:28 mediawikiwiki/echo_target_page.ibd
-rw-rw---- 1 mysql mysql 124M Mar  8 10:07 mediawikiwiki/echo_event.ibd
-rw-rw---- 1 mysql mysql 8.0M Mar  8 07:30 mediawikiwiki/echo_email_batch.ibd
metawiki
-rw-rw---- 1 mysql mysql 1.1G Mar  8 10:06 metawiki/echo_notification.ibd
-rw-rw---- 1 mysql mysql 668M Mar  8 09:54 metawiki/echo_event.ibd
-rw-rw---- 1 mysql mysql 116M Mar  8 10:06 metawiki/echo_target_page.ibd
-rw-rw---- 1 mysql mysql 1.0M Mar  8 00:09 metawiki/echo_email_batch.ibd
officewiki
-rw-rw---- 1 mysql mysql  36M Mar  8 04:34 officewiki/echo_notification.ibd
-rw-rw---- 1 mysql mysql  18M Mar  8 06:12 officewiki/echo_event.ibd
-rw-rw---- 1 mysql mysql 9.0M Mar  8 08:29 officewiki/echo_target_page.ibd
-rw-rw---- 1 mysql mysql 1.0M Mar  7 13:09 officewiki/echo_email_batch.ibd

On the rest of the wikis the tables have been renamed, for example:

+-----------------------------------+
| Tables_in_eswikiversity (%echo_%) |
+-----------------------------------+
| T159707_echo_email_batch          |
| T159707_echo_event                |
| T159707_echo_notification         |
+-----------------------------------+
+----------------------------------+
| Tables_in_eswikivoyage (%echo_%) |
+----------------------------------+
| T159707_echo_email_batch         |
| T159707_echo_event               |
| T159707_echo_notification        |
+----------------------------------+
+---------------------------------+
| Tables_in_etwikibooks (%echo_%) |
+---------------------------------+
| T159707_echo_email_batch        |
| T159707_echo_event              |
| T159707_echo_notification       |
+---------------------------------+
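The rename step above can be scripted by generating one RENAME TABLE statement per wiki and table. A sketch of that generator follows; the wiki list shown is illustrative, not the full set of databases touched:

```shell
# Sketch: generate RENAME TABLE statements that prefix the Echo tables
# with the task ID, so the original names disappear from the replica.
# The wiki list here is an illustrative subset.
for wiki in eswikiversity eswikivoyage etwikibooks; do
  for table in echo_email_batch echo_event echo_notification; do
    echo "RENAME TABLE ${wiki}.${table} TO ${wiki}.T159707_${table};"
  done
done
```

The output can then be piped into the mysql client on dbstore2001.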

I will leave it like that for a few hours to make sure replication doesn't break, and will then start importing those tables from db2033 (x1), skipping mediawikiwiki, metawiki and officewiki on db2033.
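The dump-and-skip step could look roughly like this (a sketch: the hostnames come from the task, but the database-listing command, dump flags, and target path are assumptions):

```shell
# Sketch: dump every x1 database from db2033 except the three wikis
# whose echo_% tables must stay untouched on dbstore2001.
SKIP="mediawikiwiki metawiki officewiki"
for db in $(mysql -h db2033 -BN -e 'SHOW DATABASES'); do
  case " $SKIP " in
    *" $db "*) echo "skipping $db" ;;                # wiki is excluded
    *) mysqldump -h db2033 --single-transaction "$db" \
         > "/srv/tmp/${db}.sql" ;;                   # dump the rest
  esac
done
```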

Marostegui moved this task from Next to In progress on the DBA board.

Mentioned in SAL (#wikimedia-operations) [2017-03-09T11:24:32Z] <marostegui> Stop replication db2033 - T159707

After all the checks to make sure we do not break dbstore2001, I am importing x1 into it.

Change 342171 had a related patch set uploaded (by Marostegui):
[operations/puppet] dbstore2.my.cnf: Add replication filter for wikis

https://gerrit.wikimedia.org/r/342171

Change 342171 merged by Marostegui:
[operations/puppet] dbstore2.my.cnf: Add replication ignore filters

https://gerrit.wikimedia.org/r/342171
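For context, a MariaDB replication ignore filter of the kind this change adds typically looks like the lines below. This is a sketch: the exact option names and wiki list in dbstore2.my.cnf are assumptions, not the contents of the merged patch:

```ini
# Sketch: ignore the Echo tables of the wikis whose copies already
# replicate from production, so the x1 stream does not clash with them.
replicate-wild-ignore-table = mediawikiwiki.echo\_%
replicate-wild-ignore-table = metawiki.echo\_%
replicate-wild-ignore-table = officewiki.echo\_%
```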

The replication broke because it is missing some tables?

Mentioned in SAL (#wikimedia-operations) [2017-03-10T15:03:22Z] <marostegui> Stop slave db2033 for maintenance - T159707

> The replication broke because it is missing some tables?

As we discussed, that was due to a bad regex that matched test2wiki and excluded it from the MySQL dump; that is why replication broke. The new import is now halfway through.
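That failure mode is easy to reproduce: an exclusion pattern that is not anchored to the exact name also removes look-alike wikis from the dump list. A small illustration (the wiki names and patterns here are hypothetical, not the actual regex that was used):

```shell
# A loose pattern meant to drop only testwiki also drops test2wiki:
printf '%s\n' testwiki test2wiki enwiki | grep -vE 'test.*wiki'
# Matching the exact name keeps test2wiki in the dump list:
printf '%s\n' testwiki test2wiki enwiki | grep -vxF 'testwiki'
```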

x1 is now replicating on dbstore2001. We will see if it breaks again (hopefully not!) - I will leave it replicating during the weekend.

Marostegui closed this task as Resolved. · Mar 13 2017, 9:46 AM

This looks good and has been working fine since Friday evening.