Description

After the code in T131960 is merged, we need to wait for a new dump to be ready and reload the data on the WDQS servers, to update the encoding for sitelinks.
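The reload is needed because T131960 changed how sitelink URIs are percent-encoded in the RDF serialization, so triples loaded from older dumps carry the old encoding. A minimal Python sketch of the intended behaviour; the title and URL are illustrative, not taken from the actual serializer:

```python
from urllib.parse import quote

# Hypothetical sitelink page title; on-wiki titles use "_" in place of spaces.
title = "Douglas_Adams"

# Behaviour fixed in T131960: "_" was serialized as %20 in the RDF output,
# e.g. https://en.wikipedia.org/wiki/Douglas%20Adams
# Correct serialization keeps "_" (an unreserved character per RFC 3986):
print("https://en.wikipedia.org/wiki/" + quote(title))
# -> https://en.wikipedia.org/wiki/Douglas_Adams
```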
Details
Project | Branch | Lines +/- | Subject
---|---|---|---
operations/puppet | production | +1 -1 | wdqs - send ldf traffic to wdqs1003.eqiad.wmnet
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | | Smalyshev | T131960 "_" character encoded as %20 in Wikidata URI RDF serialization
Resolved | | Smalyshev | T166244 Reload WDQS data after T131960 is merged
Event Timeline
Change 363596 had a related patch set uploaded (by Gehel; owner: Gehel):
[operations/puppet@production] wdqs - send ldf traffic to wdqs1003.eqiad.wmnet
After discussion with @Smalyshev, the data reload procedure should be:
- 1 server at a time on eqiad
- we can reload 2 servers at the same time on codfw, as the traffic there is minimal
- care must be taken to send LDF traffic away from wdqs1001 before reloading it; the appropriate Puppet patch should be merged during Puppet SWAT and reverted once the reload is completed
- dumps can be downloaded and pre-processed in parallel on all servers (a rough sketch of this sequencing follows)
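For reference, a rough orchestration sketch of the sequencing above. The host lists (beyond wdqs1001/wdqs1003, which are named in this task) and all command names are placeholders, not the actual WDQS tooling:

```python
from concurrent.futures import ThreadPoolExecutor
import subprocess

# Hypothetical host lists illustrating the eqiad/codfw split.
EQIAD = ["wdqs1001.eqiad.wmnet", "wdqs1002.eqiad.wmnet", "wdqs1003.eqiad.wmnet"]
CODFW = ["wdqs2001.codfw.wmnet", "wdqs2002.codfw.wmnet"]

def run(host, cmd):
    # Placeholder for however commands are run on the host (ssh, cumin, ...).
    subprocess.run(["ssh", host, cmd], check=True)

def reload_host(host):
    # Assumed steps: take the host out of rotation, reload the dump, repool.
    # "depool"/"pool" and "reload-wdqs-data" are placeholders, not the exact CLI.
    run(host, "depool")
    run(host, "reload-wdqs-data")
    run(host, "pool")

# Dumps can be downloaded and pre-processed on all servers in parallel first.
with ThreadPoolExecutor() as executor:
    list(executor.map(lambda h: run(h, "fetch-and-preprocess-dump"), EQIAD + CODFW))

# eqiad: strictly one server at a time (and LDF traffic must be moved off
# wdqs1001 via the Puppet patch before it is reloaded).
for host in EQIAD:
    reload_host(host)

# codfw: traffic is minimal, so two concurrent reloads are acceptable.
with ThreadPoolExecutor(max_workers=2) as executor:
    list(executor.map(reload_host, CODFW))
```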
All servers reloaded except for wdqs1001, which will probably be decommissioned soon due to T171210. If that takes more time than expected, we may still have to reload wdqs1001 as well.
Change 363596 merged by Gehel:
[operations/puppet@production] wdqs - send ldf traffic to wdqs1003.eqiad.wmnet
Mentioned in SAL (#wikimedia-operations) [2017-07-26T19:01:05Z] <gehel> depooling wdqs1001 for data reload - T166244