
Evaluate transferring the non-replicated tables to the new toolsdb server
Closed, ResolvedPublic


Four databases are not replicated on the toolsdb service, per

As such, during remediation of T216208, the following DBs were not replicated to the new server.


This ticket is to evaluate which ones need to be copied over from the old server with the understanding that tables did crash and some may be corrupt.

Related Objects


Event Timeline

Bstorm triaged this task as High priority. Feb 18 2019, 7:07 PM
Bstorm created this task.

Just saying: we have a testing host where we could try to import those databases from labsdb1005 and see if they fail or what fails during the import process.
Let me know if I can help with this.

We need to wait for user feedback: whether they want us to try to recover, restore from their own backups, or start from zero.

Of course :-). Just mentioning this as an idea to Cloud Team

Although I can probably recreate most of the tables in s51290__dpl_p from scratch, there are a few tables I’d like to recover if they turn out to be recoverable. (I’m mobile now but can come up with a list of tables sometime within the next 24 hours).

@russblau that is exactly why we created this task :-D, please send the list when you can (obviously no 100% guarantees of recovery, as the host was mostly non-working).

Thanks. The tables that I'd like to recover from database s51290__dpl_p if possible are as follows:

  • all_dabs
  • all_moves
  • bonus_list
  • contest_dabs
  • disambig_links
  • disambig_titles

Hopefully that's all, but I'll let you know if I find that any others are needed.

Also, please try

  • mo_articles
  • mo_pl
  • mo_redirect_list
  • mo_template_only_articles

Done. Please in the future create backups of those tables, as we don't guarantee they can be recovered next time (we don't even know if they kept the right data).
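As an aside, the kind of per-table backup requested above can be taken with mysqldump. This is only a sketch: the host, credentials file, and table names below are placeholders chosen for illustration, not commands from this task.

```shell
# Sketch: dump only selected tables from a ToolsDB user database to a file.
# Host, credentials path, and table names are illustrative placeholders.
db="s51290__dpl_p"

# mysqldump writes both table definitions (CREATE TABLE) and data (INSERT),
# so this single file is enough to recreate the tables later with
#   mysql "$db" < backup.sql
mysqldump --defaults-file="$HOME/replica.my.cnf" \
          "$db" all_dabs all_moves bonus_list \
          > "$HOME/${db}_backup_$(date +%F).sql"
```

Running this on a schedule (e.g. from cron) and copying the dump off the database host would have avoided the data-loss risk discussed in this task.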

Could someone fill me in on why we don't replicate these databases/tables from a technical and operational perspective?

Users were warned about this particularity; details can be seen at task T127164. From the documentation: people were asked to declare that they either had backups or could lose all the data. We are just not being too hard if there is a chance of recovering some of it.

It would be nice if the data of s51412__data could be saved/restored onto the new server too.

I can recreate it, but this would take about two days.

If the data is lost, maybe you could create all the table schemas as they were on the crashed server?

And yes, I have now seen T127164 – the original owner is offline and I have been the maintainer since January 2017, so I was not aware of this.

Bstorm lowered the priority of this task from High to Medium. Feb 21 2019, 6:34 PM

A repair of s51071__templatetiger_p would be nice. The easiest way to do this would be to allow me to import new data from dumps, but this is blocked by
Dump files are here: /data/project/templatetiger/public_html/dumps/

The SQL import commands are generated by PHP (which does not run without the database), so some help is also necessary.

A repair of s51071__templatetiger_p would be nice

I am not sure what you mean by "repair" – what errors are you seeing? Is it just that the database needs to be created? You can do that yourself, can you not?

The main problem with a repair is that the table "info" is gone. It was created by hand many years ago, and I have no idea about its structure. I hope that the other tables are generated by script.

@Kolossos I've created the database s51071__templatetiger_p and reimported the info table, let me know if that works for you. Please keep a copy offsite of info next time.

@Wurgl There is some data in the old s51412__data database but, like the others, I cannot guarantee it is in a good state or that I will be able to recover all of it. There are already tables in the new s51412__data database, so I will not overwrite any data without your consent (especially given that I cannot guarantee its reliability). I would offer you a dump (with each table and each definition in its own file) of whatever I can recover, and you can decide what to import and what to regenerate?

A dump would be fine. I do not need all of them; some are really large!

It would be fine to have a dump of the following tables: book, counter, dewiki_book, dewiki_external, dewiki_external_beacon, dewiki_spiegel

Thanks a lot

I found that the easiest way to send you the exports is to place them in your own database under a different name. You can read and explore them, and if you want, SELECT...INSERT or RENAME TABLE the _import tables:

[s51412__data]> show tables like '\_import\_%';
+--------------------------------------+
| Tables_in_s51412__data (\_import\_%) |
+--------------------------------------+
| _import_book                         |
| _import_counter                      |
| _import_dewiki_book                  |
| _import_dewiki_external              |
| _import_dewiki_external_beacon       |
| _import_dewiki_spiegel               |
+--------------------------------------+
6 rows in set (0.00 sec)

[s51412__data]> show table status like '\_import\_%';
| Name                           | Engine | Version | Row_format | Rows     | Avg_row_length | Data_length | …
| _import_book                   | InnoDB |      10 | Compact    |        0 |              0 |       16384 | …
| _import_counter                | InnoDB |      10 | Compact    |    10571 |             37 |      393216 | …
| _import_dewiki_book            | InnoDB |      10 | Compact    |        0 |              0 |       16384 | …
| _import_dewiki_external        | InnoDB |      10 | Compact    | 12466872 |             45 |   565182464 | …
| _import_dewiki_external_beacon | InnoDB |      10 | Compact    |      139 |            471 |       65536 | …
| _import_dewiki_spiegel         | InnoDB |      10 | Compact    |     3916 |             62 |      245760 | …

I will keep the task open until you check, manage or drop these tables. Remember there is no guarantee of having kept all data they used to have.
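For reference, the two adoption options mentioned above (SELECT...INSERT versus RENAME TABLE) could look roughly like the following. The table names match the _import_ listing above, but treat this as an untested sketch, not commands run during this task.

```sql
-- Option 1: keep the existing table and copy the recovered rows into it.
-- (Assumes both tables have compatible column layouts; verify with
--  SHOW CREATE TABLE before copying.)
INSERT INTO dewiki_spiegel SELECT * FROM _import_dewiki_spiegel;
DROP TABLE _import_dewiki_spiegel;

-- Option 2: discard the current table and adopt the recovered one wholesale.
-- RENAME TABLE is a metadata-only operation, so it is fast even for the
-- large _import_dewiki_external table.
DROP TABLE IF EXISTS counter;
RENAME TABLE _import_counter TO counter;
```

Option 1 is safer when the new table already contains rows worth keeping; Option 2 is simpler when the recovered copy should replace the current one entirely.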

Once everybody here is happy, we will stop and decommission the old host (after that, no more recovery will be possible).

I'm the happiest guy in the world!


@jcrespo Thanks for reinstalling the templatetiger info table. But now, as predicted, I ran into the problem of . Let's work further under that issue report.

@jcrespo please restore the table s51290__dpl_p.dab_hof ; thanks in advance!



Does this seem done at this point? I'm very interested in decommissioning 1004/5, if so.

From my side (s51412__data) it is done. Thanks.

There might be some statistic data in some table, but I do not really care :-)

It's done as far as s51290 is concerned; thanks!

Bstorm claimed this task.

I'll close this. If there is further work, please contact the cloud team in #wikimedia-cloud immediately because the old server is going to be decommissioned very soon. Thanks, everyone.

My comment from March 22nd is still valid. s51412__data is done.