User Details
- User Since
- Oct 22 2015, 3:45 PM (529 w, 5 d)
- Availability
- Available
- IRC Nick
- Abbe98 [m]
- LDAP User
- Abbe98
- MediaWiki User
- Abbe98
Nov 3 2025
This might also be of interest:
Oct 17 2025
Just to clarify: nothing on the OpenRefine end is broken; Wikimedia.cloud simply blocks OpenRefine's API access.
Oct 16 2025
Oct 15 2025
Is the code available so that we can try to reproduce it?
Aug 27 2025
Affected SPARQL backends appear to at least include Fuseki and Virtuoso.
Aug 26 2025
@Johannnes89 ah, yes of course! Thank you.
OpenRefine users are reporting that authenticating with a bot password stopped working a few days ago. Have there been additional changes that could have caused this?
Aug 6 2025
Jun 22 2025
When navigating the page by keyboard (by pressing Tab), the dropdown is skipped and only becomes focusable after the searchbox has been focused, even though it sits before the searchbox in the navigation order. The dropdown should probably be focusable from the start, and appear when focused, to avoid a situation in which users need to navigate forwards and then backwards to use it.
Jun 16 2025
Thank you so much @Reedy !
Jun 4 2025
Apr 22 2025
Feb 26 2025
These buttons currently end up in a very odd place in the DOM navigation order. When you enter the edit-label mode, you need to Tab backwards to reach them, rather than reaching them after the inputs.
Jan 29 2025
Real world use case: https://codeberg.org/abbe98/wikidata.nvim#alpha-software
Jan 28 2025
Dec 30 2024
Is there anything here actually blocking? To me it seems like the patch solves the use case outlined in the task description.
Dec 11 2024
At the time of launch it was decided, during community consultation, that the service would remain in beta until the authentication requirement was removed. It seems that consultation was forgotten, and here we go again.
Nov 22 2024
@ArthurTaylor how do you imagine changing the current RDF export format without causing massive breakage for everyone relying on the data model being stable? I'm sorry for my short comment, but your proposal struck me as rather destructive; I would expect there to be more to your suggestion than what is in your comment?
change our current RDF export format
That explains my confusion over at T371752.
Aug 29 2024
While "ownWorkDefault": "own" is still respected insofar as it hides some of the options in the usual UploadWizard, you are still presented with two other choices ("do you know what own work means" and "is it really educational"), both of which interfere significantly with the intention of campaigns, i.e. being extremely streamlined upload paths for a pre-approved, narrow category of content.
I'm not sure that's related, given that it looks like all the licensing options listed in the documentation are broken/ignored: https://www.mediawiki.org/wiki/Extension:UploadWizard/Campaigns#licensing
Aug 28 2024
@Alicia_Fagerving_WMSE @Lokal_Profil it looks like the "rights" section of the campaigns is broken. All the customizations appear to be gone (no third-party content, simplified licenses, etc.).
Aug 19 2024
I know there was a wish from WMSE to be able to track from which map/tool an upload was initiated; is there an intent to add something like that to the campaign? Assuming it would be implemented by passing a category name as a URL parameter, it would be easy to implement on my end.
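The tracking idea above can be sketched as follows. Note that the "categories" parameter name, the campaign name, and the tracking category are all illustrative assumptions here, not confirmed UploadWizard behaviour.

```python
from urllib.parse import urlencode

# Sketch: the map links to UploadWizard with a per-map tracking category
# passed as a URL parameter.  Parameter names and values are hypothetical.
def campaign_upload_url(campaign, tracking_category):
    base = "https://commons.wikimedia.org/wiki/Special:UploadWizard"
    return base + "?" + urlencode({"campaign": campaign,
                                   "categories": tracking_category})

print(campaign_upload_url("wlm-se", "Uploaded via WLM map"))
```

Each map or tool would then only need to vary the category value in the links it generates.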
Please give me a ping once the new pages are in place so that I can update the links from the WLM map:
https://github.com/fornpunkt/wlm-karta/pull/2
Aug 9 2024
@VIGNERON has a ton of Lexeme-related ideas he should add :-)
Aug 6 2024
Could it be that something else is wrong here? The ontology contains wikibase:Lexeme, wikibase:Form, and wikibase:Sense, but the properties refer to the wikibase:Wikibase* form; the EntitySchema property, however, refers to the wikibase:EntitySchema one.
Jul 28 2024
Jun 20 2024
Amazing thank you @LucasWerkmeister !
Jun 11 2024
@BBlack & @HShaikh at no point have I claimed that a few parties consuming massive resources isn't an issue; I know it is. I suggested setting up separate instances in my initial suggestion because of this, and I don't see that issue blocking it, unless you are also making the argument to put the existing public APIs behind auth.
The creation of such services on the WMF infrastructure, provided through the existing APIs, is a request that I am afraid is out of my wheelhouse. The mandate for the Enterprise team was and is specifically set to provide services for, and optimized for, large reusers.
Jun 5 2024
I am not understanding how that would necessarily make it an open proxy for use. From my understanding, most cases where APIs are being incorporated into code that is meant to be open sourced can be done via use of libraries or config files.
May 23 2024
Mar 25 2024
@Pigsonthewing it seems like both of the tools are currently working. Given that they were marked as broken at the same time, I'm guessing it might have been a temporary issue with WMCS at the time.
I wouldn't expect the API to interact with actual redirects at all, and if it's decided that the CORS-webserver option is better, I wouldn't expect that to affect anything other than the initial redirect either; a user could thus still get a CORS error further down the chain, as intended by the downstream target(s).
Mar 1 2024
I would somewhat expect it to be the full URL, as that fits the mental model of a URL shortener, but in practice it probably makes little to no difference.
Feb 21 2024
@TheDJ sums up the situation well. Changing the way the encoding works at the moment would probably break more than it fixes.
A query like "action=query&list=shorturl&suurl=https://w.wiki/1" would, however, only return the URL behind the one given? From my point of view that would resolve the use case.
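For illustration, building the proposed request could look like the sketch below. Note that "list=shorturl" and "suurl" are the parameter names proposed in this discussion, not an existing API module, and the api.php endpoint used is an assumption.

```python
from urllib.parse import urlencode

# Sketch of the proposed (hypothetical) query module for resolving a
# w.wiki short URL back to its target.  Only builds the request URL;
# the module itself does not exist at the time of writing.
def build_resolve_query(short_url,
                        api="https://meta.wikimedia.org/w/api.php"):
    params = {
        "action": "query",
        "list": "shorturl",   # proposed list module
        "suurl": short_url,   # proposed parameter name
        "format": "json",
    }
    return api + "?" + urlencode(params)

print(build_resolve_query("https://w.wiki/1"))
```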
Feb 20 2024
Feb 12 2024
Would having the prefix definitions check against an array with named keys, before defaulting to the existing values, be an acceptable interface?
Feb 7 2024
I'm not entirely sure how to migrate this one, as the jobs part was implemented by WMF staff. I'm not sure how much it's in use today, but in the past various OSM tools relied on it, so it might break things if it goes away.
Thank you @dcaro. I will, however, migrate this tool off Toolforge and just put up a static "this tool has moved" page to catch people accessing the old URLs.
Thank you @dcaro! I will give it a try over the weekend.
Dec 9 2023
Jumping in to say that I want the tool to stay up past the 14th of December.
Nov 23 2023
In case someone else comes looking here, the ontology source can be found at: https://gerrit.wikimedia.org/g/mediawiki/extensions/Wikibase/+/4d3950fc2dd8c618b238d50f5f74123e9cc053a5/docs/ontology.owl
Oct 30 2023
If it is in the document served by WikibaseManifest, then as a tool author I have the same problem as currently: if I want to make sure that all of my tool's configuration can be derived from the contents of the WikibaseManifest, then whenever I need to introduce a configuration parameter to implement a new feature in my tool, I first need to submit a patch to WikibaseManifest for it to expose this information (be it namespaced or not), get it merged, and get a new version of the extension released and deployed on the wikis I want to interact with.
So, even if we assume that the WikibaseManifest extension serves its information following some particular ontology, the tool-specific information will not be available there. That means we cannot make it possible to set up a Wikibase instance in OpenRefine simply by providing the URL of the Wikibase instance and letting OpenRefine discover the configuration there. Or how would OpenRefine discover this additional information it requires?
I'm interested in working on this across OpenRefine, the WikibaseManifest extension, and "TIB's" reconciliation service, but I wonder if this work wouldn't benefit from migrating to a proper vocabulary for describing APIs and knowledge graphs, such as Hydra, DCAT, or schema.org, instead of relying on a JSON file with more or less tool-specific fields. It would have a lot of benefits, like interoperability and extensibility: one could, for example, have a Wikibase manifest describing multiple reconciliation services, or describing any API with an OpenAPI spec, or a W3C URI, etc. We could then just describe the reconciliation service, and for things like the template for edit summaries one could just use a custom property?
May 30 2023
May 14 2023
Not 100% sure it's Podman's fault, but on Fedora (where the default docker package is now podman), mw docker mediawiki create fails with:
Jan 5 2023
Hey, @Abbe98. Charlie justified the changes done to the original ticket description in the following comments: https://phabricator.wikimedia.org/T296135#7548801, https://phabricator.wikimedia.org/T296135#7549054
Dec 26 2022
I'm not sure what happened here; the original description is gone and the solution isn't even related.
Nov 8 2022
Will there be any attempt at migrating tools automatically?
Oct 3 2022
Sep 22 2022
Jun 14 2022
A bit off-topic, but does anyone know which part of BBR is included in WLM? Only listed buildings (byggnadsminnen) or everything?
May 10 2022
@Susannaanas the Wikidata Image Positions tool supports "named place on map"!
https://wd-image-positions.toolforge.org/file/Map_India_and_Pakistan_1-250,000_Tile_NF_45-4_Faridpur.jpg
Mar 14 2022
@Lucas_Werkmeister_WMDE did you find a good solution?
This currently makes it hard to migrate various tooling to SDC, as one cannot reliably expect media uploaded with Upload Wizard to have basic SDC data.
Mar 10 2022
Is there any public documentation regarding the selection of the statuspage.io service and which users this is supposed to serve?
Feb 7 2022
Pattypan version 22.02 is now available. Pattypan no longer needs OpenJFX to be installed, and it supports Java 11+.
Feb 3 2022
I have implemented it for the Kulturlämningar one on FornPunkt.se (example), and will do the same on Kyrksök.se before writing something about it on my blog. Let's see then whether WMSE would be interested in doing a version of such a post on the wikimedia.se blog?
Feb 2 2022
@Alicia_Fagerving_WMSE below are the two additional campaign configurations:
Dec 21 2021
Nov 30 2021
Considering that FastAPI, Django, and several other prominent Python frameworks support (and sometimes require) ASGI servers, this would be great. The current alternative is Cloud VPS, in case one wants to stay on WMF's cloud services.
Nov 29 2021
Note that it's experimental, so start small :-)
Nov 20 2021
Nov 14 2021
Those coordinates are indeed just the center of the map, which is in many cases useful, as not all maps show coordinates in the title, nor a single point; however, they are just confusing when used in <maplink> as per the example.
Nov 8 2021
Do you (or another pattypan developer) subscribe to the mediawiki-api-announce mailing list? Or Tech News? What "normal channels" are you typically following?
Nov 7 2021
These tools are all affected by T280806, and the outreach seems to have targeted bots and user scripts. I personally never caught any of the outreach through the normal channels, as the framing has been "these things we deprecated back in 2014" (although these API calls have been widely used after the 2014 change), nor have any "warnings" reached me, even though Pattypan is one of the most popular batch-upload tools for Commons, used by many chapters and GLAMs.
See also T293543.
Nov 3 2021
The issue appears to have been resolved on Bing's end following the online coverage.
Oct 27 2021
Jul 28 2021
Note that for some users (like myself) an instruction on how to set up Curious Facts locally might be sufficient.
May 22 2021
ResourceSync is indeed a protocol for informing other databases about one's contents or changes (similar in scope to OAI-PMH), and as far as I know it hasn't had many implementations.
May 21 2021
Details are now available at https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2021/Data_Challenge