User Details
- User Since
- Nov 9 2016, 7:25 PM (368 w, 5 d)
- Availability
- Available
- LDAP User
- Unknown
- MediaWiki User
- Pintoch [ Global Accounts ]
Oct 31 2023
Ok, I think I still don't understand it fully, but I trust you do and I won't stand in the way ^^
Oct 30 2023
For OpenRefine (and other tools), the benefit would be that RDF ontologies can be extended very easily, so that tools can define their own (namespaced) properties.
Thanks for reviving this thread I had forgotten about ^^
Sep 28 2023
@Jason.nlw it is possible that the issue is on the OpenRefine side and not on the Wikibase.Cloud side (not sure - I haven't checked!).
What is the preferred way to listen to the updates of a Wikibase.Cloud instance? (For instance, how is the Query Service updated?)
Currently EditGroups only supports listening to updates via the Wikimedia Event Service, which is obviously only available for Wikimedia projects and not for Wikibase.Cloud. Porting it to other Wikibase instances would likely involve adding support for updates via Recent Changes polling (a rough sketch below).
There is a ticket about this:
https://github.com/Wikidata/editgroups/issues/5
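For reference, a rough sketch of what such Recent Changes polling could look like against the standard MediaWiki Action API (the endpoint URL is a placeholder for some Wikibase.Cloud instance, not anything EditGroups does today):

```
import time
import requests

# Placeholder endpoint for a hypothetical Wikibase.Cloud instance.
API_URL = "https://example.wikibase.cloud/w/api.php"

def poll_recent_changes():
    """Poll list=recentchanges, resuming where we left off via rccontinue."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|comment|user|timestamp",
        "rcdir": "newer",
        "rclimit": "max",
        "format": "json",
    }
    while True:
        resp = requests.get(API_URL, params=params).json()
        for change in resp["query"]["recentchanges"]:
            yield change  # hand the change over to the tool's ingestion
        if "continue" in resp:
            params.update(resp["continue"])  # resume from the last change seen
        else:
            time.sleep(10)  # caught up: wait a bit before polling again
```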
Sep 14 2023
Thank you @DannyS712! Let's say this patch solves the problem :)
Aug 2 2023
Yes, if it was unclear from my comments I can try to clarify again here. From an API user perspective, my preferences are as follows, from most preferred to least preferred (rough sketches of each option after the list):
- wikibase-entityid datavalue type, for consistency with the targeted user experience (EntitySchemas being entities themselves)
- string datavalue type, which is inconsistent with the UX but has the benefit of being an established datavalue type, which any existing API client library is bound to support already (Wikidata-Toolkit, Wikibase-SDK, pywikibot, …)
- A new datavalue type, which will likely require some light changes in most API client libraries, and perhaps lead to failures of various severities until those changes are made (for those which have not anticipated the introduction of new datavalue types)
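To illustrate, here is roughly what each option could look like as a datavalue, written as Python literals. The wikibase-entityid shape is guessed by analogy with items, and the type name in the last option is made up:

```
# Option 1: wikibase-entityid, by analogy with items (guessed serialization)
{"type": "wikibase-entityid", "value": {"entity-type": "entity-schema", "id": "E123"}}

# Option 2: a plain string, which existing client libraries already handle
{"type": "string", "value": "E123"}

# Option 3: a brand-new datavalue type (hypothetical name)
{"type": "entityschema", "value": "E123"}
```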
Jul 19 2023
@Addshore wrote a blog post summarizing the options around this problem, and I think it's well worth reading:
https://addshore.com/2023/07/wikibase-and-reconciliation/
Jul 16 2023
May 11 2023
It's not just useful for testing purposes. For applications like OpenRefine, which normally run directly on the user's machine and are not meant to be hosted, it is important that the callback can be a localhost URL, and therefore use HTTP. OpenRefine itself runs as a locally hosted web app (typically at http://localhost:3333/).
May 4 2023
If it's not too difficult, it would be helpful to have that behavior in list=recentchanges too, in particular for tools which do recent changes polling.
(EditGroups sadly uses the EventStreams service and not RC polling directly - not sure whether changing list=recentchanges would also impact EventStreams?)
Thank you both, this should simplify the EditGroups tool quite a bit! Here is an example of page where I had to render entity links manually, with some Javascript post-processing code:
https://editgroups.toolforge.org/b/CB/3683873dde8d/. (I guess I will keep this logic for a while, for the revisions already fetched, but it could be dropped in a few years.)
Mar 1 2023
Thanks for the heads up, I have opened an issue about it on OpenRefine's side: https://github.com/OpenRefine/OpenRefine/issues/5658
Jan 24 2023
Agreed, I would say PAWS replaces this task.
Jan 5 2023
Here is the current status of this issue:
Dec 10 2022
A feature request came in: handling changes of username on the wiki. I suspect this feature would come "for free" in any reimplementation of the current tool as a MediaWiki extension, because it would rely on MediaWiki's existing SQL tables to represent users.
Oct 14 2022
Oct 2 2022
Aug 31 2022
This works fine when using OpenRefine locally, so it probably does have something to do with the deployment, indeed. I have not checked, but I suspect the frontend code redirects the user to the project page using some relative URL, which fails in the PAWS context.
I do not know if this can easily be done, but I would assume one radical way to ensure that this does not happen would be to host OpenRefine at the root URL of some automatically-generated domain (instead of using a subpath).
If there are sensible fixes that can be applied on OpenRefine's side to make this easier, we can totally consider them.
Aug 23 2022
Aug 22 2022
Aug 16 2022
Hello all, this is just caused by the change of format for our download URL. I have added a suggested change to the pull request that should hopefully fix it:
https://github.com/toolforge/paws/pull/187#pullrequestreview-1074315167
Aug 13 2022
Jul 26 2022
Jul 19 2022
Jul 12 2022
Sounds good! I think @Spinster also intends to write a few follow-up tickets.
Jul 9 2022
Should we close this as done?
Jul 5 2022
This bug was mentioned today at LSWT'22 by @SebastianHellmann as a blocker for integrating Wikidata in linked data applications.
Jun 23 2022
Jun 19 2022
Jun 17 2022
May 21 2022
Thanks to @matthiasmullie we now know that this API response cannot return the pageid, because it is not known when the API response is sent back. The pageid is allocated asynchronously afterwards.
May 15 2022
May 8 2022
However, wouldn't it be even simpler for the user not to need the page id at all, and to be able to add claims using a Commons file name?
Apr 28 2022
One helpful step in that direction would be to return the page id in the JSON response of a successful file upload. That should be fairly straightforward and would bring the number of requests required from three to two.
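To illustrate, the response of a successful action=upload call would then look something like this (written as a Python literal; the pageid field is the proposed addition and is not returned today):

```
# Hypothetical action=upload response if the proposal were implemented:
{
    "upload": {
        "result": "Success",
        "filename": "Example.jpg",
        "pageid": 123456789,  # proposed addition, not in the current response
    }
}
```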
Apr 2 2022
I have not checked but I trust you!
Mar 29 2022
Mar 13 2022
Closing this data import task because it was created for a training workshop and therefore does not need to be done.
Closing this data import task because it was created for a training workshop and therefore does not need to be done.
Closing this data import task because it was created for a training workshop and therefore does not need to be done.
Feb 23 2022
@Eugene233 this would be worth having a look at, I think :)
Feb 21 2022
Ok, thanks a lot! So, looking at those logs:
@Antonin would running the service with OpenRefine address your comment?
Feb 18 2022
Nice! Is that really all? I am pleasantly surprised! Can you also try the following (example payloads after the list):
- A batch of reconciliation queries, with multiple files (so, converting multiple file names to mids)
- A data extension query with multiple mids, fetching multiple properties on them, one of which returns Wikidata items as values (for instance "depicts")
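For reference, here is roughly what such test payloads could look like, as Python literals. The file names and mids are made up, and I am assuming the service accepts Wikidata property ids such as P180 ("depicts") for data extension, alongside ids like Cen for captions:

```
# A batch of reconciliation queries converting several file names to mids:
{
    "q0": {"query": "File:Example one.jpg"},
    "q1": {"query": "File:Example two.png"},
}

# A data extension query with multiple mids and multiple properties,
# one of which (P180, "depicts") returns Wikidata items as values:
{
    "ids": ["M12345", "M67890"],
    "properties": [{"id": "P180"}, {"id": "Cen"}],
}
```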
Feb 15 2022
Jan 28 2022
As pointed out by @Spinster we need to update the property suggest endpoint so that captions are suggested there too:
Jan 27 2022
Jan 26 2022
The free plan should be enough.
Jan 25 2022
Yes, we can suggest anything that is a property in the sense of the reconciliation service (which is broader than what Wikibase calls properties), including wikitext, captions and other things like that.
Jan 19 2022
Jan 8 2022
Jan 6 2022
Example request: https://commonsreconcile.toolforge.org/en/api?extend={"ids":["M12345"],"properties":[{"id":"Cen"}]}
Jan 5 2022
This should only be merged once the service is actually implemented (so I would say we could have had a single task instead of three).
One simple solution to this is to include the language code in the name of the reconciliation service, like I do in the Wikibase recon service: https://github.com/wetneb/openrefine-wikibase/blob/master/app.py#L130
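Concretely, this boils down to interpolating the language code into the manifest name, along these lines (a sketch, not the actual code from app.py):

```
# Sketch: expose one manifest per language, with the language code in the
# service name so users can tell the endpoints apart in their client.
def service_manifest(lang: str) -> dict:
    return {
        "name": f"Wikimedia Commons reconciliation ({lang})",
        "identifierSpace": "https://commons.wikimedia.org/entity/",
        # ... other manifest fields
    }
```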
Let's make progress on this one!
We could set up a downtime notifier to periodically send data extension queries to the service and check that the response is right (sketched below).
This could be done via https://www.downnotifier.com/ for instance.
@Eugene233 could you set that up? We can then review the results in some weeks.
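For instance, the check could boil down to a small script like this (the mid is a placeholder, and the expected response shape follows the data extension protocol; since DownNotifier just watches a URL, a variant would be to expose such a check behind a status page):

```
import requests

SERVICE_URL = "https://commonsreconcile.toolforge.org/en/api"

# Placeholder query: fetch the English caption (Cen) of one known mid.
QUERY = '{"ids": ["M12345"], "properties": [{"id": "Cen"}]}'

def service_is_healthy() -> bool:
    resp = requests.get(SERVICE_URL, params={"extend": QUERY}, timeout=30)
    data = resp.json()
    # A correct data extension response echoes the requested ids in "rows".
    return "M12345" in data.get("rows", {})
```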
Dec 2 2021
One important thing to document is the properties supported by the reconciliation service (so that people know what they can fetch).
For instance, it would be difficult for people to discover that they can fetch the wikitext. Depending on the outcome of T296062 and T295278, there could be other things to document there.
I think this is solved!
Nov 30 2021
Nov 19 2021
@Ladsgroup thank you so much, that is massively helpful!
I think we can close this task, since the goal was to do an import in the scope of a wiki event, and we should not expect people to import more data beyond the event itself.
In the Wikidata reconciliation service, users can fetch labels, descriptions and aliases for any language (regardless of which language version of the reconciliation endpoint they are using).
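For example, a data extension query against the English endpoint can mix languages; as far as I remember the convention, the property ids encode the language (Lfr = label in French, Dde = description in German, Aes = aliases in Spanish):

```
# Data extension query mixing languages, sent to the English endpoint:
{
    "ids": ["Q42"],
    "properties": [{"id": "Lfr"}, {"id": "Dde"}, {"id": "Aes"}],
}
```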
Nov 9 2021
Curious! We will investigate more.
Oct 30 2021
Oct 29 2021
Oct 26 2021
@Eugene233 I have deployed your implementation on Toolforge. If you try it out on the reconciliation testbench, you will see that it does not work yet: we are missing the CORS headers :)
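For reference, if the service is a Flask app (an assumption on my side), adding them can be as simple as:

```
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_cors_headers(response):
    # Allow browser-based clients such as the reconciliation testbench
    # to call the API from another origin.
    response.headers["Access-Control-Allow-Origin"] = "*"
    return response
```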