Blog: https://addshore.com
Twitter: https://twitter.com/addshore
Meta: https://meta.wikimedia.org/wiki/User:Addshore
Wikitech: https://wikitech.wikimedia.org/wiki/User:Addshore
🦓🐝🐢
Interesting.
How long have you been using the mwcli Docker setup?
I could imagine you may end up in this situation if you were updating from a very old version to the current version.
@hashar right!
Is there some way to turn off the gerrit mirroring?
Potentially the gitlab changelog commands might be of use here.
Then mwcli doesn't need its own command.
I'm not really sure where to put this command; I'm worried about diluting the top-level commands, which would make things harder to find...
Crosslink T340552
Sat next to @WolfgangFahl and just showed them https://github.com/wmde/queripulator, which might be of interest in terms of query manipulation, and also as a label service without needing a label service.
https://github.com/wikimedia/mediawiki-tools-cli looks fixed now :)
Removing mwcli as I haven't seen this in some time.
If / when I implement this
Or if someone else wants to, I'd probably go for https://developers.cloudflare.com/api/#cloudflare-tunnel-create-cloudflare-tunnel at this stage.
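For anyone picking this up, a rough sketch of calling that tunnel-creation endpoint (endpoint path and fields are my reading of the linked docs; the account ID, API token, tunnel name, and secret below are all placeholders):

```python
# Hypothetical sketch of creating a Cloudflare Tunnel via the linked API.
# ACCOUNT_ID, API_TOKEN, the tunnel name, and the secret are placeholders.
import json
import urllib.request

ACCOUNT_ID = "YOUR_ACCOUNT_ID"  # placeholder
API_TOKEN = "YOUR_API_TOKEN"    # placeholder

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/tunnels",
    data=json.dumps({
        "name": "mwcli-tunnel",            # hypothetical tunnel name
        "tunnel_secret": "BASE64_SECRET",  # placeholder base64 secret
    }).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send the request; not run here.
```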
I'll go ahead and close this for now as unreproducible; feel free to reopen if this is still an issue, but we couldn't reproduce it!
So, an interesting "tidbit" here is that MediaWiki does normally come with a default page.
However, I purposefully ignored it when creating the mediawiki schemas that get rolled out for all wikis
That page (or any other page) could get included again if you copy over more content when making the schemas.
https://github.com/wbstack/api/blob/main/database/mw/README.md?plain=1#L109
It sounds like the problem here might lie entirely within quickstatements from my understanding?
It looks like quickstatements does this mapping based on configuration
And the configuration that wbstack code provides is just wrong
How does this look?
Adding at least a label or a description in one language is mandatory.
Yup, right now ARM isn't supported out of the box.
The easiest way to get there for the core feature set would be to move to more Docker Hub images, which are often built to be cross-platform, rather than WMF-maintained ones.
But there would always be WMF images for production etc. that do not have ARM builds.
mediawiki-docker-dev is no longer supported anyway, so closing this sounds fine.
So one thing I'll note is that the limited access to settings is by design.
The idea of Wikibase Cloud is that it can easily scale to thousands of Wikibases, and the limited set of configurable functionality helps with that.
Individual MediaWiki and Wikibase settings are often far too fine-grained to be useful when exposed to a wide group of users, and allowing free access to the settings can have undesired consequences.
The more settings that are exposed, and the more differently configured the wikis are, the harder platform maintenance becomes.
I did some experiments a few years back (2021)
You can find my writeup from back then https://addshore.com/2021/02/testing-wdqs-blazegraph-data-load-performance/
This was mainly trying different Java and Blazegraph options to see how things changed the behaviour while loading data, but I also tried a bunch of different GCP hardware including different disk setups.
@bking Would it be possible to get me access to an R2 bucket that is paid for by the WMF in some way?
I'll happily continue my manual process of putting a JNL file in a bucket every few months for folks to use, until the point that this is more automated.
In T343686#9090090, @Evelien_WMDE wrote: From Telegram, popular request is to introduce "Merge items" linking to /Special:MergeItems under the Wikibase section.
We'll definitely add this, anything else popping up here can be added on top.
The energy savings are possibly unclear, at least in the current case. That's partly because it's hard to know how much energy is being expended; it could be estimated from the number of dump downloads, though I'm not sure how easy it is to get those stats (this is different from the bandwidth transfer stats on Cloudflare R2).
From my side, I believe everything that is in a cron there should be in https://github.com/wikimedia/analytics-wmde-scripts
Specifically to see what is happening within the crons see https://github.com/wikimedia/analytics-wmde-scripts/tree/master/cron
I'm gonna work on actually publishing this soon :)
Adding some usage numbers to this task.
Of the JNL files that I am currently hosting on Cloudflare, 4.28 TB of traffic has been used in the past 30 days, which equates to roughly 3-4 downloads of the file.
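For transparency, the back-of-the-envelope calculation behind that download estimate (the ~1.2 TB file size is my rough assumption, not a measured figure from this task):

```python
# Sanity check on the download estimate; the 1.2 TB file size is a
# rough assumption, not a measured figure.
traffic_tb = 4.28    # R2 traffic over the past 30 days
file_size_tb = 1.2   # assumed size of one full JNL file

downloads = traffic_tb / file_size_tb
print(f"~{downloads:.1f} downloads")  # lands in the 3-4 range
```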