
Decommission WDQS full graph endpoint (wdqs2009)
Closed, Resolved (Public)

Description

Overview

The Wikidata Platform team has decided to decommission the legacy full graph endpoint (wdqs2009) for the Wikidata Query Service (WDQS). This endpoint was created after the graph split earlier this year to provide a longer migration timeline for WDQS users with academic and testing use cases that depend on a full graph representation. When the endpoint was released to the community, it was announced that it would only be available until December 2025.

We are requesting SRE support to turn off this endpoint and depool it from the WDQS cluster on Tuesday January 20th, 2026. This timeline will allow for community support through the rest of the year with relevant stakeholders (now - Jan. 6th) and provide time for WDP to triage issues stemming from the change (Jan. 7th - Jan. 9th).

Justification

WDP is primarily focused on the stability and scalability of the Wikidata platform, and maintaining the legacy endpoint is not strategically aligned with these efforts. The key area of investment from now until the end of FY25-26 is migrating away from Blazegraph towards a more reliable backend triple store system. Reducing operational burden (the legacy endpoint has no SLOs, but SREs and WDP still respond to incidents) and freeing up hardware will help us pursue our priorities.

We have aligned with power users of the legacy endpoint on this work and on the requested timeline for the change to take effect.

Details

Other Assignee
Gehel

Event Timeline

At T413097 I made a request for computing resources to support the WikiQlever transition team in keeping the Scholia / WikiCite project up.

Can a decision please be made as soon as possible, before staff holiday breaks, which are beginning now?

thanks

Thanks, @Bluerasberry. Acknowledging the dependency on T413097 from the WDP side, as well. Unfortunately, we likely won't have the necessary inputs to prioritize that request before the end of 2025.

@Gehel , is there flexibility on picking this request up in a sprint after the articulated date of Jan 7th? This will be helpful for us in figuring out how to stagger this work.

@Gehel , is there flexibility on picking this request up in a sprint after the articulated date of Jan 7th? This will be helpful for us in figuring out how to stagger this work.

I assume that "this request" means T413097. In which case, this is the responsibility of the Cloud-Services team, so I can't speak for them. We can delay the decommission of the legacy WDQS endpoint if this helps.

We can delay the decommission of the legacy WDQS endpoint if this helps.

Yes please. Given the discussions over in T413097, something like a month would seem appropriate.

We are postponing the decommissioning of the wdqs2009 endpoint.

We can't yet commit to a new implementation date, as we need to better understand the blocker presented by T413097 with @Daniel_Mietchen and the cloud services team. We will post a timeline update on this ticket, ideally by January 14th, once we reach alignment with all stakeholders.

The new date for completing this request is January 20th. @Gehel, please advise if you want me to update the description to reflect this.

We no longer consider the open questions on T413097 to be a blocker to decommissioning the legacy endpoint and are moving forward with repurposing this hardware for our ongoing migration efforts.

Mentioned in SAL (#wikimedia-operations) [2026-01-20T14:21:44Z] <gehel> switching off Blazegraph on wdqs2009 (legacy full graph endpoint is end of life) - T411410

The query-legacy-full.wikidata.org/sparql endpoint is now disabled. The UI is still exposed; we need to clean this up soon-ish.

Let's wait at least 1h before starting on the backup of the journal file.

Let's wait at least 1h before starting on the backup of the journal file.

Ack.

A backup of the Blazegraph journal file will be available through March 2026 at: https://dumps.wikimedia.org/other/wdqs/

The journal is 1.3 TB uncompressed and is provided split into 100 gzip-compressed chunks. It can be rebuilt using:

zcat *.gz > wikidata.jnl

The sha256 checksum of the complete file is:

1691c15c97d50e442871740b2e2a8fe31b54a22f19972d65c77e0a3993977b15