Private account of @Lucas_Werkmeister_WMDE (he/him, Berlin timezone). Anything I do here is on volunteer time, even if it looks work-related :)
User Details
- User Since: Jun 5 2016, 4:36 PM (496 w, 5 d)
- Availability: Available
- IRC Nick: lucaswerkmeister
- LDAP User: Lucas Werkmeister
- MediaWiki User: Lucas Werkmeister [ Global Accounts ]
Yesterday
- In Magnus’ Rust library, request_builder() takes a single params hashmap that goes into the query for GET requests and into the body for all other requests. If you want query params for POST requests, you presumably have to put them into the URL yourself, though I don’t know if there’s an example for that given how rarely it comes up.
- In MediaWiki core’s mw.Rest JS library, post and other non-GET methods take a path string and a body object. If you want query params, you need to put them into the path yourself, and we can actually see that in GrowthExperiments’ fetchUserImpactData().
Plot twist: it turns out that, unlike in the action API, parameters are absolutely not interchangeable between the query string and the POST body in the REST API. (In the Action API, a few parameters have to go into the body, and origin, crossorigin and centralauthtoken have to be in the query string, but everything else can be in either place. m3api puts everything except action and origin/crossorigin in the body.) There are POST endpoints with query parameters (e.g. /growthexperiments/v0/user-impact/{user}, /ipinfo/v0/revision/{id}, /checkuser/v0/temporaryaccount/{name}; none in MediaWiki core AFAICT), and trying to specify those query parameters in the body will yield an error.
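To spell out the consequence for client code: a REST POST has to carry its query parameters in the URL and everything else in the body. A minimal sketch of the URL-building half, assuming a generic fetch-based client (the `lang` query parameter is made up for illustration, not a documented parameter of those endpoints):

```typescript
// Build a REST API URL with the query parameters in the URL itself --
// unlike in the Action API, they must NOT be moved into the POST body.
function buildRestUrl(
    base: string,
    path: string,
    query: Record<string, string>,
): string {
    const url = new URL(base + path);
    for (const [key, value] of Object.entries(query)) {
        url.searchParams.set(key, value);
    }
    return url.toString();
}

// The body is then sent separately, e.g.:
// fetch(buildRestUrl(base, '/growthexperiments/v0/user-impact/Example', { lang: 'en' }),
//       { method: 'POST', body: JSON.stringify(bodyParams) });
```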
Sun, Dec 7
I ran a poll on fedi – options 2 and 3 each got 3/5 votes (so one person voted for both), option 1 got none, nobody suggested anything else. @Legoktm raised the good point that option 3 is best for usage with TypeScript, because you can enforce correct parameters (and, in the case of JSON-returning methods, provide more information about the response structure) with a lot of overloads like:
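Whatever shape option 3 takes exactly, the TypeScript mechanism Legoktm means is overload signatures over a shared implementation; a self-contained sketch (all names below are hypothetical, not the actual m3api-rest interface):

```typescript
type JsonBody = Record<string, unknown>;

// Overload signatures: the compiler enforces that GET/HEAD take no body
// and that POST/PUT/DELETE require one.
function buildRequest(
    method: 'GET' | 'HEAD',
    path: string,
): { method: string; path: string };
function buildRequest(
    method: 'POST' | 'PUT' | 'DELETE',
    path: string,
    body: JsonBody,
): { method: string; path: string; body: string };
// Shared implementation, hidden behind the overloads above.
function buildRequest(method: string, path: string, body?: JsonBody) {
    return body === undefined
        ? { method, path }
        : { method, path, body: JSON.stringify(body) };
}

// buildRequest('GET', '/v1/page/Foo', {});  // compile-time error: GET takes no body
// buildRequest('POST', '/v1/page');         // compile-time error: body is required
```

The same trick extends to response types: an overload whose path parameter is a template literal type can promise a more specific JSON structure for that endpoint.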
Sat, Dec 6
Thu, Dec 4
Wed, Dec 3
I’ve set up the repository at https://gitlab.wikimedia.org/repos/m3api/m3api-rest using cookiecutter-m3api; actual content will follow over the coming days :)
Tue, Dec 2
Weren’t we installing a custom Rawdog fork anyway?
Mon, Dec 1
Sun, Nov 30
Or, alternatively, the path is always a string, but we supply a template literal tag function that helps you encode the title correctly:
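A sketch of what such a tag function could look like (the name `restPath` is my invention):

```typescript
// Tag function that percent-encodes every interpolated value, so that a
// '/' inside a title cannot be mistaken for a path separator.
function restPath(strings: TemplateStringsArray, ...values: string[]): string {
    return strings.reduce(
        (path, part, i) => path + encodeURIComponent(values[i - 1]) + part,
    );
}

const title = 'AC/DC';
restPath`/v1/page/${title}/history`; // '/v1/page/AC%2FDC/history'
```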
Suggested usage from m3api-oauth2:
I wonder if the library should include support for formatting the URL path – turning /v1/page/{title}/history + { title: 'AC/DC' } into /v1/page/AC%2FDC/history with a properly escaped / in the title. Probably yes, if I can think of a good interface for it?
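One possible interface, sketched under the assumption that the library expands `{name}` placeholders itself (the function name `expandPath` and the error behavior are assumptions, not an existing m3api-rest API):

```typescript
// Replace each {name} placeholder with the percent-encoded parameter value.
function expandPath(template: string, params: Record<string, string>): string {
    return template.replace(/\{(\w+)\}/g, (_match, name: string) => {
        if (!(name in params)) {
            throw new Error(`missing path parameter: ${name}`);
        }
        return encodeURIComponent(params[name]);
    });
}

expandPath('/v1/page/{title}/history', { title: 'AC/DC' }); // '/v1/page/AC%2FDC/history'
```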
If anyone wants to help me bikeshed names, my current ideas are:
Tue, Nov 25
Mon, Nov 24
The good news: an extremely rudimentary, but working version of this can be got with a laughably short patch:
Current outcome:
$ curl -X QUERY -d action=query https://en.wikipedia.org/w/api.php
<html><body>
<h1>405 Method Not Allowed</h1>
A request was made of a resource using a request method not supported by that resource
</body></html>
Thanks, that’s useful to know. But there have been exciting new developments – QUERY has been approved! ⇒ T410883: Support HTTP QUERY method as standard alternative to Promise-Non-Write-API-Action header
Is this a duplicate of T372910: Make REST Handler router enforce Promise-Non-Write-API-Action header?
Mon, Nov 17
Thanks, good shout – this patch appears to be sufficient to make the m3api-oauth2 tests pass against my local wiki:
Sat, Nov 15
Slightly more readable version of what appears to be the loop:
includes/Request/WebRequest.php(860): MediaWiki\Session\SessionManager->getSessionForRequest()
includes/Permissions/PermissionManager.php(1572): MediaWiki\Request\WebRequest->getSession()
includes/Permissions/PermissionManager.php(1514): MediaWiki\Permissions\PermissionManager->getUserPermissions()
includes/Permissions/UserAuthority.php(271): MediaWiki\Permissions\PermissionManager->userHasRight()
includes/Permissions/UserAuthority.php(130): MediaWiki\Permissions\UserAuthority->internalAllowed()
includes/User/User.php(2129): MediaWiki\Permissions\UserAuthority->isAllowed()
includes/User/CentralId/LocalIdLookup.php(104): MediaWiki\User\User->isAllowed()
includes/User/CentralId/CentralIdLookup.php(238): MediaWiki\User\CentralId\LocalIdLookup->lookupCentralIds()
includes/User/CentralId/CentralIdLookup.php(316): MediaWiki\User\CentralId\CentralIdLookup->nameFromCentralId()
extensions/OAuth/src/Backend/Utils.php(292): MediaWiki\User\CentralId\CentralIdLookup->localUserFromCentralId()
extensions/OAuth/src/ResourceServer.php(180): MediaWiki\Extension\OAuth\Backend\Utils::getLocalUserFromCentralId()
extensions/OAuth/src/ResourceServer.php(157): MediaWiki\Extension\OAuth\ResourceServer->setUser()
extensions/OAuth/src/ResourceServer.php(89): MediaWiki\Extension\OAuth\ResourceServer->setVerifiedInfo()
vendor/league/oauth2-server/src/Middleware/ResourceServerMiddleware.php(54): MediaWiki\Extension\OAuth\ResourceServer->{closure:MediaWiki\Extension\OAuth\ResourceServer::verify():88}()
extensions/OAuth/src/ResourceServer.php(85): League\OAuth2\Server\Middleware\ResourceServerMiddleware->__invoke()
extensions/OAuth/src/SessionProvider.php(279): MediaWiki\Extension\OAuth\ResourceServer->verify()
extensions/OAuth/src/SessionProvider.php(109): MediaWiki\Extension\OAuth\SessionProvider->verifyOAuth2Request()
includes/Session/SessionManager.php(569): MediaWiki\Extension\OAuth\SessionProvider->provideSessionInfo()
includes/Session/SessionManager.php(136): MediaWiki\Session\SessionManager->getSessionInfoForRequest()
includes/Request/WebRequest.php(860): MediaWiki\Session\SessionManager->getSessionForRequest()
(There’s a hole in my bucket…) Hm, I think OAuth might be not so much “resource-hungry” as “broken”? When running the m3api-oauth2 integration tests against a local wiki, with very little custom OAuth configuration, it encounters this error:
(Note that this is after configuring xdebug.max_nesting_level=8192, which is a preposterously high value. XDebug’s default is 512 levels, which I had previously doubled for Peast; this is eight times more than that again and still not enough to break out of the infinite loop. I guess in CI, where XDebug probably isn’t installed, nothing breaks out of the loop and that’s where the timeout comes from?)
Fri, Nov 14
Yes, same as before. This is useful data that I don’t think should be thrown away. (Also, my suggestion to exclude URLs or URL prefixes, as opposed to domains – which, to be clear, I also object to – seems to have gotten lost, as you’re again only talking about domains.)
Thu, Nov 13
As far as I can tell, this is working, and I just added it to the Scribunto documentation. I think we’re done here – thanks @cscott for the code review! \o/
Also I was under the impression that the rsync upload already happens from CI and not from a manual user.
Huh, thanks. @Ladsgroup could you perhaps rm -r /srv/doc/m3api/tmp-m3api-example on doc2003?
Nov 10 2025
I see, thanks for explaining! Then I guess I’ll just remove the update.php call and see if anything breaks or anyone complains :)
Alright, now I think we just need someone™ with the right permissions to effectively rm -r https://doc.wikimedia.org/m3api/tmp-m3api-example/. (Looking at httpd-doc.wikimedia.org.conf, I think it would be at /srv/docroot/org/wikimedia/doc/m3api/tmp-m3api-example/? On doc1004 and/or doc2003?) @bd808 or @jnuche can you do that? :)
Well, as someone maintaining CI templates to install / set up MediaWiki (GitHub action, GitLab CI), it doesn’t sound like I can remove the update.php call yet if I want my templates to be useful for arbitrary extensions, because I don’t know if the extensions other people would like to use in their CI have been converted to virtual domains or not. (Though this is to some extent academic – I’ve advertised the CI templates a bit, but I don’t know if anyone else actually uses them.)
Nov 9 2025
Well, that sounds like it only fixes it for the OAuth extension? What about other extensions?
Nov 8 2025
And I also published new releases of the other packages, so all of these work now:
I’m working around this for now on the main branch by checking out the commit just before the “bad” one.
Okay, git bisect says Deprecate $wgMWOAuthSharedUserIDs is the first bad commit. (I retried each CI job twice, i.e. three builds per commit, to hopefully rule out any flakiness. The results seem quite consistent.)
I’ll see if I can git bisect this down to a specific commit to blame…
Publishing m3api-query docs with the new v1.1.0 release worked \o/
Nov 3 2025
Oops, sorry for not searching properly, this looks like it should’ve been findable.
Oh, this is interesting – it’s only reproducible if the request URL doesn’t have a “proper” path.
Nov 2 2025
Looking at this code in index():
Oct 27 2025
Oct 25 2025
Further commits:
Looking at the code suggests that the $PUB_LOCATION is only meant to allow putting the docs into further subdirectories of the default value; changing the prefix is not allowed. Let’s just go with the longer, slightly redundant URLs then.
Nope, not allowed. CI output (not kept forever because my volunteer account doesn’t have permission to do that apparently ^^):
It works \o/ \o/ \o/
Trying this out in the new tmp-m3api-example repo, using the wip branch (current tip but I may force-push that away later) of the ci-templates.
Oct 17 2025
I believe this task can now be closed (not sure which status is best, let’s go with Resolved for now); thanks to T214998: RFC: Serve mobile and desktop variants through the same URL (unified mobile routing), opening https://en.wikipedia.org/wiki/Kecksburg_UFO_incident?wprov=yicw1 on a mobile device (tested in Firefox on Android) will now stay on the en.wikipedia.org domain (with the ?wprov= parameter intact) and display the mobile site there.
Oct 16 2025
me@host operations-puppet $ git grep curve25519-sha256@libssh.org
modules/gitlab/manifests/ssh.pp: Array[String] $kex_algorithms = ['curve25519-sha256@libssh.org', 'diffie-hellman-group-exchange-sha256'],
modules/ssh/templates/sshd_config.erb:KexAlgorithms curve25519-sha256@libssh.org,diffie-hellman-group-exchange-sha256
Oct 12 2025
Oct 10 2025
Oct 7 2025
Tagging Reconciliation since Abbe98 suggested the issue might be there rather than in OpenRefine proper.
Oct 2 2025
Sep 29 2025
Sep 27 2025
Sep 19 2025
Sep 18 2025
This is a problem with a local gadget (https://commons.wikimedia.org/wiki/MediaWiki:Gadget-Stockphoto.js#L-427), not the MediaWiki software, so it never belonged on Phabricator.
This database table stores the URLs of the outgoing links from articles. It exists primarily for preventing and finding undesirable links.
Sep 16 2025
\o/ thanks!
I also moved all the to-be-deleted tmp-* repositories to the lucaswerkmeister/ namespace, to get them out of the children.json list immediately. (They’ll still be deleted in a week.)
Hm, I guess we need to pick a group first, I didn’t think about that yet 😅
https://gitlab.wikimedia.org/groups/repos/m3api/-/children.json works (extra repos/), I think that would be okay! (I just scheduled the tmp-* repositories for deletion, though they still show up in that JSON at the moment. If that’s a problem, I can delete them immediately.)
Sep 15 2025
I’ve published 1.0.1 versions of all the libraries (Mastodon announcement), and with that I think this is done! (The remaining subtasks will hopefully happen sooner or later but I don’t consider them blockers.)
Sep 14 2025
Alright, I’ve created all the real repositories on GitLab and manually copied over all the releases. At some point during the next week (hopefully), I’ll publish new releases of the libraries (1.0.1, probably), both to get the new repo URLs published more widely and to test the release CI. (I don’t want to do it today, so that the release creation dates between the imported and fresh releases are different.)
o_O apparently it shows the release date in the list of releases…
…but the creation date when showing the individual release:
I’ll take it, I guess.
Recreating all the GitHub releases on GitLab. Just a bit of manual busywork. (Notably, GitLab releases let you set the release date, so it should be possible to reflect the original release date. Yay!)
Sep 13 2025
We’ll only know for sure when I do the actual migration and the first releases from GitLab, but I think as far as I can tell now, this is done.
As mentioned in T392716#11178593, for now I’m leaving the docs on GitHub pages, so this task is no longer blocked on T392716. Which I think means the actual migration should be good to go now…
For now I’ve updated the CI templates to push the docs to GitHub, so the documentation can continue to be hosted on GitHub pages, unblocking the rest of the m3api migration. Maybe some day this task will happen, but after almost five months I really don’t feel like letting it block the migration any longer.
Sep 12 2025
Sep 10 2025
Thanks a lot! \o/

