Blog: https://addshore.com
Twitter: https://twitter.com/addshore
Meta: https://meta.wikimedia.org/wiki/User:Addshore
Wikitech: https://wikitech.wikimedia.org/wiki/User:Addshore
🦓🐝🐢
Hopefully the outcome will be worth it :)
Some of the data I have is also already in other people's public indexes there (the real question will be whether I can reasonably reuse it in its current form)
And generally speaking, I'd like to think that once I document the data, others might find it useful (for example, git commits of all Wikimedia-related repos through time will be one of the data sets)
That is a relatively large amount of data
In T419994#11755529, @despens wrote: I really like that this change is going to happen!
Perhaps this is a chance to leap to the (obvious) next step, which would be to allow each Wikibase to set an arbitrary prefix?
+14/15 pods/jobs will mean +15 jobs, is that OK? (we don't have a separate pod limit)
Tested this morning in incognito on the same phone on a fresh user account where the only setting change is to turn on the beta feature, and I still have the issue.
Is there any advanced setup, extension-wise? I can have a go at reproducing locally if someone gives me a pointer or two.
So, I reproduced this right away (with phone screen recording) yesterday.
I click the link to the item in this ticket, scroll down to the edit links for statements, click edit, and the issue can be seen.
Apparently the recording is too big for phab, and my commons upload attempt failed (will try again later)
So I did some further thinking about this on the way home from the hackathon (part of my write up at https://addshore.com/2026/03/wikimedia-hackathon-northwestern-europe-2026/)
Doing a HEAD request also gives you the links in the response
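For illustration, a minimal PHP/curl sketch (the URL is a placeholder, not one from this thread): a HEAD request returns the same headers, links included, without transferring the body.

$ch = curl_init( 'https://example.org/some/resource' );
curl_setopt( $ch, CURLOPT_NOBODY, true ); // send HEAD instead of GET
curl_setopt( $ch, CURLOPT_HEADER, true ); // include response headers in the output
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$headers = curl_exec( $ch );
curl_close( $ch );
echo $headers; // the links show up here, with no body downloaded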
In T414004#11686985, @ItamarWMDE wrote: I can confirm this is still happening with updates from 0.29.2 to 0.30.0
From my vague memories, it was split out of Wikibase at that point in time; the Cognate extension was created to manage interlanguage links on Wiktionary,
as the same interwiki-storing code that was part of Wikibase would not run on Wiktionary for the sitelinks that Cognate provided.
Hence the split.
Generally speaking, I'd say yes, the container should just have Python in it.
However currently mwcli doesn't control the images, they are releng / mw developer images.
It's something I have always sat on the fence about, in terms of whether mwcli should just build its own images too, tbh!
And a separate one while just making an API call:
{"error":{"code":"internal_api_error_DBConnectionError","info":"[cb61a3d938a16e2eb840f474] Caught exception of type Wikimedia\\Rdbms\\DBConnectionError","errorclass":"Wikimedia\\Rdbms\\DBConnectionError"}}
(the JSON was followed by an HTML error page)
Currently, if we followed Adam's lead and didn't create a proper extension, but just hooks that insert those installed-software rows, we wouldn't get anything back from the ActionAPI for this.
Yep, this one is very easy to reproduce.
I wrote some more thoughts here today while sat on a train: https://www.wikidata.org/wiki/User:Addshore/EditGroups, as edit groups came up on https://www.wikidata.org/wiki/Wikidata_talk:Requests_for_comment/Mass-editing_policy again, along with the mention of making it "first party".
Thanks for the report,
I definitely need to change something here soon (if not just revert the change that depends on the dashboard for service resolution).
The proposal should be on wiki at mediawiki.org so as to include @Addshore (the creator of mwcli) and other volunteers.
Basically, with how discussiontools parses comments, the part between the heading and the first signature is interpreted as the "first post".
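As a toy illustration of that behaviour (this is not DiscussionTools' actual parser, just the shape of the rule):

// Everything after the heading, up to the first signature, becomes one "first post".
$section = "== Topic ==\nPreamble meant as standalone context.\nActual first comment. ~~~~\nA reply. ~~~~";
[ $heading, $body ] = explode( "\n", $section, 2 );
$firstPost = substr( $body, 0, strpos( $body, '~~~~' ) );
echo trim( $firstPost ); // the preamble and the comment come back as a single post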
Is this constrained by the host or client download speed?
What speed can https://files.scatter.red/orb/2025/12/ allow?
And what's your connection speed?
It will not currently work for custom services, though that should be easy enough to add too, and would be very neat
Can confirm today this still happens
$wgHooks['SoftwareInfo'][] = function ( &$software ) {
	// Add extra rows to the "Installed software" table on Special:Version
	$software['[https://www.mediawiki.org/wiki/Wikibase/Suite Wikibase Suite]'] = '0.0.0';
	$software['[https://www.mediawiki.org/wiki/Wikibase/Suite/Deploy Wikibase Suite Deploy]'] = '0.0.0';
	return true;
};

This is ultimately fixed in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/635, pending release.
Which switches to local.wmftest.net instead of *.localhost
Some part of this will actually come in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/638 to play around with
In the form of a status page that tells you 1) what sites you have and 2) what services are running.
We will trial an alternative approach to things in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/638
There is a "dashboard" with API that "knows" the status of the services.
mediawiki just checks this APi for the state, and also caches for 5s.
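Roughly this pattern, as a sketch only (the endpoint URL and cache key below are made up for illustration, not the real mwcli ones):

$cache = MediaWiki\MediaWikiServices::getInstance()->getMainWANObjectCache();
$status = $cache->getWithSetCallback(
	$cache->makeKey( 'mwcli-dashboard', 'service-status' ),
	5, // cache the dashboard's answer for 5 seconds
	static function () {
		// Hypothetical endpoint; the real dashboard API will differ.
		$json = file_get_contents( 'http://dashboard.local/api/status' );
		return json_decode( $json, true );
	}
);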
This is missing the addition of the updatelog https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/c23338f0b71ea18c4defc24c5f7b606d04ccca67/repo/includes/Store/Sql/DatabaseSchemaUpdater.php#L249-L252
Needs fixing in Wikibase :)
So I think rebuildPropertyTerms is there from an earlier version.
It looks like the updatelog row for rebuildItemTerms is only added via update.php, and not when the maintenance script that does this job is run directly.
So rebuilding the terms storage in SQL
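For reference, the usual shape of that bookkeeping (a sketch; the hook class and the updatelog key name here are illustrative, not the real Wikibase ones):

class ExampleSchemaUpdates {
	public static function onLoadExtensionSchemaUpdates( DatabaseUpdater $updater ) {
		// update.php checks the updatelog table and records a row once the
		// rebuild has happened, so it isn't repeated on later runs...
		if ( !$updater->updateRowExists( 'Wikibase-rebuildItemTerms' ) ) {
			// ...but running the maintenance script directly skips this
			// bookkeeping, which is the gap described above.
			$updater->insertUpdateRow( 'Wikibase-rebuildItemTerms' );
		}
	}
}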
Right cc @Lydia_Pintscher
It sounds like the "right" way is indeed to have the expectation that Wikibases install https://www.mediawiki.org/wiki/Extension:ShortUrl, and then they themselves can generate short URLs for their own query services, for a consistent feature set across the ecosystem.
I believe the only way to use the new API is with a plan, which has limits
In T411634#11429411, @dena wrote: Done in 1 h 23 min 29 s.
logs: https://cloudlogging.app.goo.gl/RnsF4JhzTRxJybRWA
I wonder if we could run out of memory here.
Screenshot from the status page side of things showing the period of time it couldn't edit for
So yes, CI for mwcli currently fails due to this issue with PHP version highlighted by @hoo
So, chatting on this with @karapayneWMDE briefly yesterday.
A short term "fix" so that this doesn't actually immediately break, might be to make a mwcli image that is based on the newer mediaiwki image, however curl will be reverted to a previous version / pinned back so that the expectations around how mwcli works remain the same in the short term.
However I'm not sure if the "libcurl" vs "curl" issue described in the description of this task means this wouldn't actually work.
This would need a brief investigation / attempt.
Great, I'll look at merging this and including it in the next release too :)
Is x3 accessible via either 1) Quarry or 2) the stat / analytics clusters?
I can see that analytics-mysql on the stat hosts has a --use-x1 option, but nothing for x3
Urgff
Looking encouraging