User Details
- User Since
- Dec 12 2020, 11:36 AM
- Availability
- Available
- IRC Nick
- zabe
- LDAP User
- Zabe
- MediaWiki User
- Zabe
Yesterday
Should be working again. Following up in T423821.
Thu, Apr 16
We will start removing the old tables from wikireplicas on 28 May.
Wed, Apr 15
I really want to get rid of this. :)
Fri, Apr 10
If we query filerevision first and support doing a secondary sort by the file name, it fixes the query.
This is what the read-old query looks like:
wikiadmin2023@10.64.32.35(enwiki)> analyze SELECT /*! STRAIGHT_JOIN */ file_name, fr_timestamp, actor_user, actor_name
    FROM `file`
    JOIN `filerevision` ON (file_latest = fr_id)
    JOIN `actor` ON (actor_id = fr_actor)
    LEFT JOIN `user_groups` ON (ug_group = 'bot' AND ug_user = actor_user AND (ug_expiry IS NULL OR ug_expiry >= '20260410142742'))
    WHERE ug_group IS NULL
    ORDER BY fr_timestamp DESC, file_name DESC
    LIMIT 51;
+------+-------------+--------------+--------+----------------------------+---------+---------+-------------------------------+---------+------------+----------+------------+---------------------------------+
| id   | select_type | table        | type   | possible_keys              | key     | key_len | ref                           | rows    | r_rows     | filtered | r_filtered | Extra                           |
+------+-------------+--------------+--------+----------------------------+---------+---------+-------------------------------+---------+------------+----------+------------+---------------------------------+
|    1 | SIMPLE      | file         | ALL    | file_latest                | NULL    | NULL    | NULL                          | 1018833 | 1003200.00 |   100.00 |     100.00 | Using temporary; Using filesort |
|    1 | SIMPLE      | filerevision | eq_ref | PRIMARY,fr_actor_timestamp | PRIMARY | 8       | enwiki.file.file_latest       |       1 |       0.96 |   100.00 |     100.00 |                                 |
|    1 | SIMPLE      | actor        | eq_ref | PRIMARY                    | PRIMARY | 8       | enwiki.filerevision.fr_actor  |       1 |       1.00 |   100.00 |     100.00 |                                 |
|    1 | SIMPLE      | user_groups  | eq_ref | PRIMARY,ug_group,ug_expiry | PRIMARY | 261     | enwiki.actor.actor_user,const |       1 |       0.28 |    50.53 |     100.00 | Using where; Not exists         |
+------+-------------+--------------+--------+----------------------------+---------+---------+-------------------------------+---------+------------+----------+------------+---------------------------------+
4 rows in set (6.654 sec)
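For illustration, a hedged sketch of the rewritten query: driving the join from filerevision (instead of scanning file) lets an index on fr_timestamp satisfy the ORDER BY, avoiding the full table scan and filesort seen above. This assumes a suitable fr_timestamp index exists and that file_name as a secondary sort key is acceptable; it is a sketch, not the final patch.

```sql
-- Sketch only: join order reversed so the fr_timestamp index can drive the sort.
SELECT /*! STRAIGHT_JOIN */ file_name, fr_timestamp, actor_user, actor_name
FROM filerevision
JOIN file ON file_latest = fr_id
JOIN actor ON actor_id = fr_actor
LEFT JOIN user_groups ON ug_group = 'bot'
    AND ug_user = actor_user
    AND (ug_expiry IS NULL OR ug_expiry >= '20260410142742')
WHERE ug_group IS NULL
ORDER BY fr_timestamp DESC, file_name DESC
LIMIT 51;
```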
Places in core needing migration:
- WikiFilePage::getForeignCategories()
Marked the following revisions as bad. Please note that since revisions may share the same content row, and per the above analysis this should be the case for every single one of these, this also affects at least another 425 revisions.
Thu, Apr 9
The last place that needs migrating is ApiQueryAllPages.
Tue, Apr 7
First step has already been done in https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1266343.
ruwiki should work again, see T422459.
fiwiki should work again, see T422459.
It might make sense to hold off on this one until T416548: Start reading from file table on wmf production is done.
Fri, Apr 3
Disabled QueryPage updates for Special:Unusedtemplates since it caused issues (T422062).
Ok, the difference is that the MediaWiki implementation filters for pages in $wgContentNamespaces, which on Commons includes files (see here), while the Hadoop implementation currently only filters for NS_MAIN.
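The difference can be sketched as two WHERE clauses. This is a hypothetical simplification, not the actual production queries: table and column names are from MediaWiki core, and it assumes $wgContentNamespaces on commonswiki is NS_MAIN plus NS_FILE.

```sql
-- Hadoop-style: articles only.
SELECT cl_from, COUNT(*) AS cats
FROM categorylinks JOIN page ON page_id = cl_from
WHERE page_namespace = 0                -- NS_MAIN only
GROUP BY cl_from ORDER BY cats DESC LIMIT 50;

-- MediaWiki-style: all content namespaces; on commonswiki this is
-- assumed to include NS_FILE (6), so files are counted as well.
SELECT cl_from, COUNT(*) AS cats
FROM categorylinks JOIN page ON page_id = cl_from
WHERE page_namespace IN (0, 6)          -- $wgContentNamespaces (assumption)
GROUP BY cl_from ORDER BY cats DESC LIMIT 50;
```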
Taking a look at https://analytics.wikimedia.org/published/datasets/querypage/MostCategories/commonswiki.json and comparing it to https://commons.wikimedia.org/wiki/Special:MostCategories it seems like the Hadoop implementation does not include files.
Disabled the updates for Special:Unusedtemplates on testcommonswiki until it properly supports splitting of the links tables.
Thu, Apr 2
@Dreamy_Jazz akwiki is a closed wiki, so I guess that is why the table was dropped there. But for some reason we still try to query it?
Sat, Mar 28
Reposting: Of those four, the archive table seems to be the only one with significant size (although cusi_case appears to be new and could grow), so I would suggest restricting efforts to that one for now.
Mar 21 2026
The bug here is that in the old query design the filtering happened while querying, so that after filtering we had $params['limit'] + 1 results. Now we query $params['limit'] + 1 items and only then do the filtering, so we can end up with fewer items (which also causes the continue element to vanish).
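A minimal, self-contained sketch of the difference (plain Python with made-up rows; passes_filter stands in for the real per-row filter, not the actual API code):

```python
LIMIT = 3  # stands in for $params['limit']

rows = [1, 2, 3, 4, 5, 6, 7, 8]        # candidate result rows
passes_filter = lambda r: r % 2 == 0   # stands in for the real filter

# Old design: filter while querying, stop once limit + 1 rows have passed.
old = []
for r in rows:
    if passes_filter(r):
        old.append(r)
    if len(old) == LIMIT + 1:
        break
# len(old) == LIMIT + 1, so the extra row reliably becomes the continue element.

# New design: fetch limit + 1 rows first, filter afterwards.
fetched = rows[:LIMIT + 1]             # [1, 2, 3, 4]
new = [r for r in fetched if passes_filter(r)]
# Only 2 rows survive -- fewer than LIMIT + 1, so no extra row is left over
# to serve as the continue element and continuation silently stops.

print(len(old), len(new))  # prints: 4 2
```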
Mar 19 2026
Hmm. The "Add wikidata support" task is usually more of a "make sure the sites tables are up to date on all wikis" task.
mwscript-k8s -f -- extensions/WikimediaMaintenance/maintenance/createExtensionTables.php --wiki=abstractwiki translate should create the correct tables.
