Mon, Feb 22
A couple of follow-up things.
Sat, Feb 20
A bit more about the use-case. Early next month the external Viae Regiae project, with which Wikidata:WikiProject Early Modern England and Wales is closely co-operating, will start a mass-participation effort to transcribe all of the places and placenames on several series of 16th- and 17th-century maps, like this 1576 Saxton map of Essex on Commons. (They will actually be using a higher-resolution copy, which we will be uploading.)
Nov 26 2020
Sep 21 2020
Platform team's IIIF project homepage: https://www.mediawiki.org/wiki/Core_Platform_Team/Initiatives/IIIF_API
I've written up a bit about today's zoom call at Commons:Village Pump, here. Feel free to add / correct / amend as desired.
Sep 17 2020
Oh, now I've found it. The speech-bubble button at the top left of the view toggles a display of all of the annotations recorded on the image -- very, ''very'' nice work. Thank you!!
That Mirador view is very nice. I like the way that it is able to operate from an IIIF JSON manifest generated from the Wikidata item for the underlying object (code built by @LucasWerkmeister, I think), which then allows a description of the image to be displayed using the (i) button at the top right.
Sep 16 2020
Long-term subscribers to this ticket will be excited to see T261621 ("Support the addition of the IIIF API for Wikimedia projects regarding content partnerships"), created two weeks ago,
plus this announcement that's just appeared on the IIIF community's IIIF-discuss list ( https://groups.google.com/g/iiif-discuss/c/r9yf2GnaF1U ):
Aug 10 2020
Aug 3 2020
The answer is, we ought to remove the unused ones from the RDF dump too, as discussed at T258474.
@dcausse It *does* hurt a person who is trying to make sense of the dump: they will see all these unfamiliar prefixes declared, and may then assume there are corresponding kinds of predicates or objects that they have to make sense of.
Aug 1 2020
Ticket description should be re-written.
In some ways it's quite nice that WCQS uses tinyurl rather than the w.wiki shortener -- at least it means there is not such a limit on how long the query can be. (T220703).
Jul 24 2020
It would be helpful if at least one of the rdf:type statements were retained, as they make it easy to select a subset of M-IDs for a query to work on.
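For example, something like the following would then be possible (a sketch only -- the exact class name the Commons RDF output would retain is an assumption here, and may well differ):

```sparql
# Select just mediainfo entities that carry a "depicts" statement.
# "schema:ImageObject" is illustrative; whatever rdf:type the dump
# retains would serve the same scoping purpose.
SELECT ?mediainfo WHERE {
  ?mediainfo rdf:type schema:ImageObject ;
             wdt:P180 [] .
}
LIMIT 100
```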
Most of the above prefixes will be unnecessary, unless we propose to create new properties local to Commons that are not defined on Wikidata.
Given that mediainfo items are just of the form sdc:M12345, is much meaningful autocompletion for these actually possible?
It also might be worth making the M-ID number prominently visible on the structured data tab of the filepage, given that this is where the information related to that ID is shown.
@Lucas_Werkmeister_WMDE : I think it would be more feasible to add the Special:FilePath URL to the WikibaseMediaInfo RDF, and combine WDQS and WCQS that way.
PS. It's also stupidly hard to find the M-ID from a Commons file page at the moment. This would be a good thing to display in the "structured data" tab there, I think.
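A federated query is the kind of thing that combination would enable. A sketch, in which the WCQS endpoint URL and the use of schema:contentUrl to carry the Special:FilePath URL are both assumptions:

```sparql
# Find Commons files depicting humans, joining WCQS "depicts"
# statements against item data in WDQS. schema:contentUrl stands in
# for whatever predicate would carry the Special:FilePath URL.
SELECT ?file ?fileUrl ?person WHERE {
  SERVICE <https://commons-query.wikimedia.org/sparql> {
    ?file wdt:P180 ?person ;
          schema:contentUrl ?fileUrl .
  }
  ?person wdt:P31 wd:Q5 .
}
LIMIT 10
```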
I think you meant
wd:Q123 wdtn:P18 sdoc:M6919529
in that second line?
IMO the best solution here would be to add triples of the form
Mar 17 2020
Yes, there's a cost to you of providing a service based on current WDQS, that then has to be ripped out for a new version based on WDQS 2.
Mar 16 2020
Engineering a completely new search facility for Commons Data rather than using SPARQL is a *stupid* *waste* *of* *time* *and* *resources*.
Mar 4 2020
Mar 2 2020
That's a bit of a problem, given what the badges are for...
I tried to add an "intentional sitelink to redirect" badge on the English sitelink for asteroid 6765 Fibonacci, but got "Could not save due to an error. The save has failed."
Feb 28 2020
Regarding editors: it looks like some works have multiple editors, none of whom have Wikidata items. These ought to be given in separate statements distinguished by "series ordinal".
But it may be that, because the statements both have the same 'main value' (or rather, both have somevalue for the main value), the software used to add them coalesced them together. QuickStatements tends to do this, I think.
- "named as" = name given to the subject of the statement
- "stated as" = how the object of the statement was stated
Feb 19 2020
@Lucas_Werkmeister_WMDE The qualifier "stated as" (P1932) is currently used on 6.6 million statements. I couldn't get a query to complete to count how many of those statements have an object that's a blank node. My guess would be on the order of 10,000, but that's just a number pulled out of the air, not based on anything. It could be a *lot* more if this mechanism has been used, e.g., for scientific papers with unmatched editors, publishers, etc.
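For reference, the query I was attempting looked roughly like this (the join from qualifier to statement main value is the expensive part, and it times out at this scale):

```sparql
# Count statements carrying a "stated as" (P1932) qualifier whose
# main value is a blank node (i.e. somevalue).
SELECT (COUNT(DISTINCT ?st) AS ?n) WHERE {
  ?st pq:P1932 ?statedAs .
  ?prop wikibase:statementProperty ?ps .
  ?st ?ps ?value .
  FILTER(isBlank(?value))
}
```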
Feb 11 2020
Feb 8 2020
Example of a Listeria tracking page, counting how many blank nodes are being used this way for the properties used on a particular set of items (in this case: a particular set of books, where the publisher (known) may not yet have an item, or at least not yet a matched item): https://www.wikidata.org/wiki/Wikidata:WikiProject_BL19C/titles_stmts
Please don't think or refer to the blank nodes as just "unknown values".
Jan 16 2020
Thanks Matthias, but that doesn't seem to help with the use-cases above, if a claim is just sitting there as a claim, and not being used by any Lua template (as most "depicts" claims probably wouldn't be?)
Dec 9 2019
Lead ticket for Vue migration for Wikidata would appear to be T157014 . After sustained activity in 2017, followed by a short spike in June-July 2018, it's not clear how much further progress has been made, or is currently anticipated on this. There is a mention that the Lexemes roll-out in 2018 included some Vue templates and widgets with PHP server-side rendering, which was going to be reviewed.
WMF projects use OOUI and org-wide design principles. Wikibase does not.
Dec 7 2019
Given the endless, and seemingly ongoing, difficulties that the decision to fork the Wikibase UI continues to produce -- most types of statement still not available, somevalue not available, deprecation not available, references not available -- at what point does it make more sense to withdraw the forked UI as a costly experiment that hasn't worked, and is causing more trouble than it is worth?
Nov 29 2019
@Pintoch That's great. I'll try to get these matched this weekend. Is there any chance of the full dataset, beyond the first 1000? We currently have 1660 items for bodies of water in the UK (plus more that possibly don't have P31s), so it would be nice to be able to try to match them all. But thanks again for this!
Nov 23 2019
If maxlag is to be based on the maximum lag of the pooled servers, will there be active measures to monitor these, and to take any really badly lagged server (i.e. one significantly more lagged than the others) out of the pool, and out of the maxlag calculation, to give it a chance to recover?
Nov 15 2019
I would love it if somebody would take this on. I think it's a really worthwhile resource to link to, and a really significant topic to improve our data for.
Nov 14 2019
Hi @Pintoch. Thanks for this.
One thing that seems odd (to an outsider like me who knows very little about the system) is that some servers seem to be performing so much worse than others.
Oct 31 2019
@Lucas_Werkmeister_WMDE How dependent is the wikibase constraint system on SPARQL ?
Maarten was just suggesting that a functioning SPARQL service is required for any of the constraint checking to operate (and so this ticket would be completely blocked until CDQS is reliably up and running).
Is that correct? Or are any of the simplest of the constraint checks (e.g. format constraints, one-of constraints) available without SPARQL?
Oct 30 2019
Oct 27 2019
I very strongly agree with this ticket. It needs to be as easy as possible for query-service users to find queries that illustrate "how do I do this?"
Use-case example to show why this is urgently needed: The property source of file (P7482) is intended to show the broad nature of the origin of a file.
Oct 15 2019
@Lydia_Pintscher - Having now kicked a few possibilities around on Project Chat, can we go for creating badges with:
There is some sense in what MisterSynergy says, but I also think there is sense in what @deryckchan wrote in the RfC (here), namely that there still ought to be some warning if someone tries to sitelink to a redirect page, and the user should be made to actively confirm that they didn't want to instead link to the redirect target. Otherwise I could see us ending up with a lot of accidental sitelinks to mis-spellings, which the present system mostly keeps us safe from.
Oct 14 2019
Thanks Lydia. I've started a thread at Wikidata:Project_chat#Badges_for_sitelinks_to_redirects to quickly see if there are particular icons people would prefer; and created two items, Q70893996 ("sitelink to redirect") and Q70894304 ("intentional sitelink to redirect") where we can start to assemble translations.
The badges, and corresponding bot, might be a nice quick win for Wikidata's 7th birthday.
Mostly, the existence of a sitelink to a redirect indicates a potential data problem on Wikidata: a sitelink that has been left over when two Wikipedia articles have been merged, but no corresponding merge has been made on Wikidata. A sitelink to a redirect can therefore be a strong indication that an item on Wikidata is a duplicate of another one, and should be merged with it. However, at the moment it is not possible for somebody browsing a Wikidata article to readily spot that a sitelink points to a redirect. A badge to identify this would bring the fact into plain sight.
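Once such a badge existed, the affected sitelinks would also become queryable. A sketch, using the "sitelink to redirect" item Q70893996 created above as the badge value:

```sparql
# Items whose English Wikipedia sitelink carries the
# "sitelink to redirect" badge -- i.e. candidate duplicates to review.
SELECT ?item ?sitelink WHERE {
  ?sitelink schema:about ?item ;
            schema:isPartOf <https://en.wikipedia.org/> ;
            wikibase:badge wd:Q70893996 .
}
```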
Oct 12 2019
For discoverability, maintenance, and reuse, it is as important to be able to store metadata for datasets via SDC as it is to be able to store metadata for images.
Are we any further forward with this?
Sep 27 2019
Sep 19 2019
On a separate but related issue: we now have quite a lot of images of old maps on Commons, with coordinate georeferencing allowing the maps to be "warped" to standard coordinate systems. It would be nice to be able to serve the warped versions of the maps as tilesets, allowing users to compare different historical representations of given places as layers.
Sep 18 2019
Yeah, this is a problem. I recently had a new Wikidata item (Q66458942) deleted, because the admin wasn't able to see that it was being used as a value in SDC statements on several files, even though he'd checked https://commons.wikimedia.org/wiki/Special:EntityUsage/Q66458942 (cf discussion on admin's talk page).
Aug 22 2019
The aim was to try to identify the different generic stages of the machine-vision workflow, and to think how each stage could be made as "pluggable" as possible, so that users could plug in their own approach at any stage, and report their results from that stage back into the overall pipeline.
Aug 17 2019
To link the original file to these objects, three new SDC properties are proposed:
Aug 16 2019
Quick update for those following from home.
So for example this afternoon @Miriam is going to be leading a workshop session introducing image classifiers and clustering algorithms, with a view to Commons users starting to explore writing their own large-scale machine-learning image-analysis tools. It would be good if there were syntax in place so they could write the results of those explorations to Commons SDC, so that they were there and accessible for the community to then refine or extract or take further for each image. Similarly, two days ago @Multichill led a small group session to try to think through the generic pipeline and workflow for machine-learning contributions, and what kind of open framework is needed to support bulk contributions of that kind from all comers and any set or subset of images. @Fuzheado too has been talking about some of the investigations he has been doing with machine vision and the Metropolitan Museum collection. All those voices, and more, would I think have useful input for this conversation.
Aug 15 2019
It would be good for unassessed suggestions for statements (and not necessarily just "depicts" statements) to be accessible via the SPARQL query service, in the same way as regular statements, but possibly set with a lower trust level.
Aug 14 2019
Resource page (under development): https://commons.wikimedia.org/wiki/User:Bertspaan/maps
Aug 13 2019
By the way, I have suggested a Wikidata property proposal for external georeferencer URL, since a basic thing we will want to record is whether an external service has georeferencing for an image, and if so what the relevant URL is.
Aug 12 2019
@bert I'll be there; I'm coming in on the Tuesday afternoon flight from Edinburgh, and then I'll be at the Comfort Hotel Xpress Stockholm Central.
Aug 6 2019
@Bugreporter : T225778 "Define canonical URI for EntitySchemas" was opened a few weeks ago as a specific ticket for what the canonical URI should be, so I've added that in as a subtask for this ticket.
Aug 5 2019
A couple of issues:
Aug 1 2019
I do agree with @Legoktm that it would be nice to have a service where one could record and browse one's own and other people's recent queries -- and perhaps tag them into particular categories of interest.
A core point of this service was to be able to share long queries. (I'm guessing WDQS now accounts for the majority of organically requested shortenings?)
Jul 29 2019
It would be nice -- and not just for depicts statements -- for there to be an additional rank, beyond "preferred", "normal", and "deprecated", so that statements could be given the rank "suggested by machine, but further confirmation desired".
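If such a rank existed, query syntax might look something like this (purely hypothetical: wikibase:SuggestedRank does not exist today; only the Preferred, Normal, and Deprecated ranks do):

```sparql
# Fetch machine-suggested "depicts" values separately from
# community-confirmed ones.
SELECT ?file ?depicted WHERE {
  ?file p:P180 ?st .
  ?st ps:P180 ?depicted ;
      wikibase:rank wikibase:SuggestedRank .   # hypothetical new rank
}
LIMIT 50
```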
Jul 27 2019
@bert Interesting proposal. But for me it raises some issues. Firstly, about the business case for it. Secondly, regarding implementation.
Jun 25 2019
Agree with @alaa_wmde that no case has yet been made for having structured data on EntitySchemas within Wikidata. But it would be useful to have the EntitySchemas themselves (not surrogate items for them) represented in WDQS, so that they can be queried. That is the subject of ticket T225701 "Add EntitySchemas to the Query Service". A federated service, rather than putting them in the main WDQS triplestore, would probably satisfy this.
Jun 21 2019
^^ I saw exactly what Wurgl reports, browsing long pages on en-wiki (eg the Fram case) in London. Now seems okay.
Jun 14 2019
According to this post on Wikidata-l, there is a standard RDF serialisation (ShExR) for Shape Expressions, with a test suite on the ShEx spec github.
The community has now given the thumbs-up to Wikidata:Property proposal/Shape Expression for class, to link a class item to the Shape Expression that members of it should conform to.