spark.sql('show partitions structured_data.commons_entity').show()
+-------------------+
| partition|
+-------------------+
|snapshot=2024-12-16|
|snapshot=2024-12-23|
|snapshot=2024-12-30|
|snapshot=2025-01-06|
|snapshot=2025-01-13|
|snapshot=2025-01-20|
+-------------------+

Maybe related to T386255: wmf.wikidata_item_page_link and wmf.wikidata_entity snapshots stuck at 2025-01-20? Later data appears to have been dumped successfully (see https://dumps.wikimedia.org/commonswiki/entities/20250203/), so this may just be a problem with ingestion into Hive.
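For reference, a quick staleness check on a partition list like the one above, a minimal sketch assuming weekly snapshots (the cadence and grace period are assumptions, not a property of the actual pipeline):

```python
from datetime import date

def latest_snapshot(partitions):
    """Return the most recent date from Hive-style 'snapshot=YYYY-MM-DD' partition strings."""
    return max(date.fromisoformat(p.split("=", 1)[1]) for p in partitions)

def is_stale(latest, today, cadence_days=7, grace_days=2):
    """Flag a table as stale if more than one cadence (plus grace) has passed."""
    return (today - latest).days > cadence_days + grace_days

# Partition values as shown by `show partitions` above
parts = [
    "snapshot=2024-12-16", "snapshot=2024-12-23", "snapshot=2024-12-30",
    "snapshot=2025-01-06", "snapshot=2025-01-13", "snapshot=2025-01-20",
]
latest = latest_snapshot(parts)
print(latest)                               # 2025-01-20
print(is_stale(latest, date(2025, 2, 5)))   # True: well past one weekly cycle
```

Running this against the output of `spark.sql('show partitions ...')` would confirm the table is behind the dumps, consistent with the ingestion (rather than dump) side being the problem.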
Downstream tracking task: T385865: Resume data pipeline operations