Tue, Oct 15
@Lydia_Pintscher - Having now kicked a few possibilities around on Project Chat, can we go for creating badges with:
There is some sense in what MisterSynergy says, but I also think there is sense in what @deryckchan wrote in the RfC (here), namely that there still ought to be some warning if someone tries to sitelink to a redirect page, and the user should be made to actively confirm that they didn't want to instead link to the redirect target. Otherwise I could see us ending up with a lot of accidental sitelinks to mis-spellings, which the present system mostly keeps us safe from.
Mon, Oct 14
Thanks Lydia. I've started a thread at Wikidata:Project_chat#Badges_for_sitelinks_to_redirects to quickly see if there are particular icons people would prefer.
The badges, and corresponding bot, might be a nice quick win for Wikidata's 7th birthday.
Mostly, the existence of a sitelink to a redirect indicates a potential data problem on Wikidata: a sitelink that has been left over when two Wikipedia articles have been merged, but no corresponding merge has been made on Wikidata. A sitelink to a redirect can therefore be a strong indication that a Wikidata item is a duplicate of another one, and should be merged with it. However, at the moment somebody browsing a Wikidata item cannot readily spot that a sitelink points to a redirect. A badge to identify this would bring the fact into plain sight.
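To make the bot side concrete: detecting these cases mostly comes down to asking the wiki's API whether the sitelinked page carries the redirect flag. A minimal sketch (the `action=query&prop=info` response shape is the real MediaWiki API one; the helper name and sample data are illustrative only):

```python
# Sketch of how a badge-maintenance bot might test whether a sitelinked
# page is a redirect, from the MediaWiki API's page-info output.
# The API module (action=query&prop=info) is real; the helper name and
# the sample response below are illustrative.

def is_redirect(info_response: dict) -> bool:
    """Return True if any page in an action=query&prop=info response
    carries the 'redirect' flag (present as a key when the page is one)."""
    pages = info_response.get("query", {}).get("pages", {})
    return any("redirect" in page for page in pages.values())

# Example response shape, as returned for a redirect page:
sample = {
    "query": {
        "pages": {
            "12345": {"pageid": 12345, "title": "Some merged article", "redirect": ""}
        }
    }
}
```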
Sat, Oct 12
For discoverability, maintenance, and reuse, it is as important to be able to store metadata for datasets in SDC as it is to be able to store metadata for images.
Are we any further forward with this?
Fri, Sep 27
Thu, Sep 19
On a separate but related issue: we now have quite a lot of images of old maps on Commons, with coordinate georeferencing allowing the maps to be "warped" to standard coordinate systems. It would be nice to be able to serve the warped versions of the maps as tilesets, allowing users to compare different historical representations of given places as layers.
Sep 18 2019
Yeah, this is a problem. I recently had a new Wikidata item (Q66458942) deleted, because the admin wasn't able to see that it was being used as a value in SDC statements on several files, even though he'd checked https://commons.wikimedia.org/wiki/Special:EntityUsage/Q66458942 (cf discussion on admin's talk page).
Aug 22 2019
The aim was to try to identify the different generic stages of the workflow in machine vision, and to think how each stage could be made as "pluggable" as possible -- that is, how users could be enabled to plug in their own approach at any stage, and report their results from that stage back into the overall pipeline.
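As a purely illustrative sketch of what "pluggable" could mean here (the stage names and the runner are hypothetical, not an agreed design): each stage is a swappable callable, and every intermediate result is captured so it can be reported back.

```python
# Illustrative sketch only: one way the generic machine-vision workflow
# could be made "pluggable", with each named stage a swappable callable.
# Stage names and the pipeline runner are hypothetical.

from typing import Any, Callable

Stage = Callable[[Any], Any]

def run_pipeline(image: Any, stages: dict[str, Stage]) -> dict[str, Any]:
    """Feed each stage the previous stage's output, recording every
    intermediate result so it can be reported back into the pipeline."""
    results: dict[str, Any] = {}
    current = image
    for name, stage in stages.items():
        current = stage(current)
        results[name] = current
    return results

# A user plugs their own classifier in at the 'classify' stage:
results = run_pipeline(
    "image-bytes",
    {
        "preprocess": lambda img: img.upper(),           # stand-in preprocessing
        "classify": lambda img: ["map", "engraving"],    # stand-in classifier output
    },
)
```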
Aug 17 2019
To link the original file to these objects, three new SDC properties are proposed:
Aug 16 2019
Quick update for those following from home.
So for example this afternoon @Miriam is going to be leading a workshop session introducing image classifiers and clustering algorithms, with a view to Commons users starting to explore writing their own large-scale machine-learning image analysis tools. It would be good if there were syntax in place so they could write the results of those explorations to Commons SDC, so that the results were there and accessible for the community to then refine, extract, or take further for each image. Similarly, two days ago @Multichill led a small group session to try to think through the generic pipeline and workflow for machine-learning contributions, and the kind of open framework needed to support bulk contributions of that kind from all comers, for any set or subset of images. @Fuzheado too has been talking about some of the investigations he has been doing with machine vision and the Metropolitan Museum collection. All those voices, and more, would I think have useful input for this conversation.
Aug 15 2019
It would be good for unassessed suggestions for statements (and not necessarily just "depicts" statements) to be accessible via the SPARQL query service, in the same way as regular statements, but possibly set with a lower trust level.
Aug 14 2019
Resource page (under development): https://commons.wikimedia.org/wiki/User:Bertspaan/maps
Aug 13 2019
By the way, I have suggested a Wikidata property proposal for an external georeferencer URL, since a basic thing we will want to record is whether an external service has georeferencing for an image, and if so what the relevant URL is.
Aug 12 2019
@bert I'll be there; I'm coming in on the Tuesday afternoon flight from Edinburgh, and then I'll be at the Comfort Hotel Xpress Stockholm Central.
Aug 6 2019
@Bugreporter : T225778 "Define canonical URI for EntitySchemas" was opened a few weeks ago as a specific ticket for what the canonical URI should be, so I've added that in as a subtask for this ticket.
Aug 5 2019
A couple of issues:
Aug 1 2019
I do agree with @Legoktm that it would be nice to have a service where one could record and browse one's own and other people's recent queries -- and perhaps tag them into particular categories of interest.
A core point of this service was to be able to share long queries. (I'm guessing WDQS now accounts for the majority of organically requested shortenings?)
Jul 29 2019
It would be nice -- and not just for depicts statements -- for there to be an additional rank, beyond "preferred", "normal", and "deprecated", so that statements could be given the rank "suggested by machine, but further confirmation desired".
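For context on where such a rank would slot in: ranks are already exposed in the WDQS RDF model via `wikibase:rank`, with three values today. The query below shows that existing mechanism; `wikibase:SuggestedRank` is invented here purely to illustrate the proposal, and is not a real vocabulary term.

```python
# How statement ranks already surface in WDQS, and where a hypothetical
# "suggested" fourth rank could slot in.  wikibase:rank and the three
# existing rank values (Preferred/Normal/Deprecated) are real;
# wikibase:SuggestedRank is invented to illustrate the proposal.

query = """
SELECT ?statement ?rank WHERE {
  wd:Q42 p:P31 ?statement .
  ?statement wikibase:rank ?rank .
  # Today ?rank is one of wikibase:PreferredRank, wikibase:NormalRank,
  # or wikibase:DeprecatedRank.  The proposal would add a fourth value,
  # e.g. wikibase:SuggestedRank, which consumers could filter out by
  # default in the same way deprecated statements are:
  FILTER (?rank != wikibase:DeprecatedRank)
}
"""
```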
Jul 27 2019
@bert Interesting proposal. But for me it raises some issues. Firstly, about the business case for it. Secondly, regarding implementation.
Jun 25 2019
Agree with @alaa_wmde that no requirement has yet been made out for having structured data on EntitySchemas within Wikidata. But it would be useful to have the EntitySchemas themselves (not surrogate items for them) represented in WDQS, so that they can be queried. That is the subject of ticket T225701 "Add EntitySchemas to the Query Service". A federated service, rather than putting them in the main WDQS triplestore, would probably satisfy this.
Jun 21 2019
^^ I saw exactly what Wurgl reports, browsing long pages on en-wiki (eg the Fram case) in London. Now seems okay.
Jun 14 2019
According to this post on Wikidata-l, there is a standard RDF serialisation (ShExR) for Shape Expressions, with a test suite on the ShEx spec github.
The community has now given the thumbs-up to Wikidata:Property proposal/Shape Expression for class, to link a class item to the Shape Expression that members of it should conform to.
Jun 4 2019
May 22 2019
Looks like the issue may be in the code of :c:Module:Artwork, which @Jarekt has been working on. The sandbox version of the module, called by the sandbox version of the template, does not show the problem.
May 21 2019
@Tagishsimon the WDQS tag is appropriate though, as it includes the pipeline for getting statements into the WDQS triplestore, and also any corruption issues happening there. Thanks for creating the ticket!
May 8 2019
The other workaround, of course, is just to copy & paste the URL into tinyurl, which seems able to take at least 4500 characters -- but that is something of a step back to square one.
May 1 2019
Apr 28 2019
Apr 27 2019
I am finding this too.
Apr 25 2019
Apr 16 2019
@GoranSMilovanovic Overlap table for P1367 (Art UK Artist ID) now showing about two-thirds of the overlap hits that it should be. (VIAF 6825, ULAN 6153, RKD 5582, ISNI 4240, Benezit 4643) vs true VIAF 11633, ULAN 10552, RKD 9769, ISNI 8237, Benezit 7529.
Apr 15 2019
@Envlh That's very nice. So for example, here are your comparison tables for
P1367 (Art UK Artist ID): https://tools.dicare.org/properties/?property=1367&type=ExternalId
and for P650 (RKDArtists ID): https://tools.dicare.org/properties/?property=650&type=ExternalId
When you've got the data sorted, a table showing the closest identifiers by Jaccard similarity, rather than total overlap, might be quite interesting.
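The Jaccard comparison I have in mind is simple to compute from the per-property counts (the counts in the example below are just an illustrative shape, not real figures):

```python
# Sketch of the Jaccard-similarity table suggested above, computed from
# per-property usage counts.  Example counts are illustrative only.

def jaccard(count_a: int, count_b: int, overlap: int) -> float:
    """Jaccard similarity of two identifier sets, given their sizes and
    the size of their intersection: |A ∩ B| / |A ∪ B|."""
    union = count_a + count_b - overlap
    return overlap / union if union else 0.0

# Two properties overlapping 800 times, out of 1000 and 1200 uses:
similarity = jaccard(1000, 1200, 800)   # 800 / 1400
```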
The numbers in the overlap data table for P1367 (Art UK Artist ID) are way off as well -- only one tenth of the VIAF and ULAN overlaps correctly reported, only one twentieth of the RKD Artist ID overlaps.
Jan 29 2019
Why are we baking in the assumption that only one Wikibase instance can be associated with a particular entity type?
Jan 16 2019
As a postscript to my comment two posts above, note that in such a scenario a Commons category page might well be associated with both an item on Wikidata (via a sitelink equivalence) and a local item on the Commons wikibase.
Jan 12 2019
See also: T180113 "Support the creation and use of volunteer tools that help to convert information in Commons categories to structured data"
In the context of this thread, it's worth recalling the ongoing wish from Commons users for Commons categories to be able to have their own local items on the Commons wikibase.
Jan 11 2019
Nov 16 2018
Hi @JeanFred . The point is not so much to link multiple versions of the same viewer, but rather to link to some of the different viewers available, for example here is the Alba Madonna in the "Universal Viewer" used as a default by the Wellcome Collection and the British Library and others.
Note that there has been some discussion on the IIIF mailing list as to whether a https://www.wikidata.org/wiki/Property:P1630 URL formatter should be added for the property:
Nov 2 2018
To support what Smalyshev said: occasional temporary update lag may not be such a high-priority issue; but prolonged or repeated update lag would rapidly become one.
Oct 15 2018
Oct 12 2018
That's nice. I didn't know that service included Commons.
(Copied from T173346)
I'd like to note strong interest on two different occasions from two different people in the last month in a top-tier GLAM that I'm working with, about the possibility of being able to use Commons as an IIIF hosting service for high-resolution images, eg for serving tiles of old maps for geo-referencing to platforms that need to be able to access images from an IIIF service.
Presumably (as the original ticket suggests), one option would be to set up a Commons Data Query Service (CDQS) as a distinct endpoint on a new server that had both Wikidata and CommonsData installed in a Blazegraph instance locally. This would allow the existing WDQS service to continue without alteration, while allowing the CDQS service set-up to be optimised, as we learn more about the sort of queries that people want to run.
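To illustrate how a query on that separate endpoint might work: Commons-side triples would be matched locally, with Wikidata joined in over SPARQL federation. The WDQS SERVICE URL below is the real one; the way the CommonsData side is modelled (`wdt:P180` on file entities) is an assumption for illustration.

```python
# Rough sketch of a query on a hypothetical separate CDQS endpoint:
# Commons-side triples matched locally, Wikidata joined in via federation.
# The WDQS SERVICE URL is real; the CommonsData modelling is assumed.

federated_query = """
SELECT ?file ?depictsLabel WHERE {
  ?file wdt:P180 ?depicts .                      # local CommonsData triple (assumed shape)
  SERVICE <https://query.wikidata.org/sparql> {  # federate out to the existing WDQS
    ?depicts rdfs:label ?depictsLabel .
    FILTER (lang(?depictsLabel) = "en")
  }
}
"""
```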
Oct 8 2018
The 400 character limit can also be a problem for the titles of old books, maps, etc -- the original titles of these, as given eg in library catalogues, can get pretty long.
Oct 7 2018
See this query: tinyurl.com/y76l2bft for an example of how to draw lines on a WDQS results map, based on values from variables in a WDQS query (in this case, drawing the bounding boxes of some maps)
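The essence of the bounding-box drawing is just building a closed WKT LINESTRING that the map view can render when typed as a `geo:wktLiteral`. A minimal sketch (note that WKT point order is "longitude latitude"; the helper name is my own):

```python
# Minimal sketch of the bounding-box drawing mentioned above: build the
# closed WKT LINESTRING for a box from its corner coordinates.  WDQS's
# map view renders this when the value is typed as a geo:wktLiteral.
# Note WKT point order is "longitude latitude".

def bbox_linestring(min_lon: float, min_lat: float,
                    max_lon: float, max_lat: float) -> str:
    corners = [
        (min_lon, min_lat), (max_lon, min_lat),
        (max_lon, max_lat), (min_lon, max_lat),
        (min_lon, min_lat),    # repeat the first corner to close the box
    ]
    return "LINESTRING(" + ", ".join(f"{lon} {lat}" for lon, lat in corners) + ")"
```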
Aug 14 2018
Thanks @SandraF_WMF . I've started putting together some links at :c:User:Jheald/IIIF_180815 that I or Andy Mabbett could talk through, to give an idea of what sort of IIIF interaction is possible at the moment.
Aug 13 2018
A heads-up that "Wikipedia and IIIF" is the proposed subject for the IIIF community call this week -- see https://groups.google.com/forum/?hl=en#!topic/iiif-discuss/wy2uRl_ukJ0
Jul 16 2018
Interesting slide-show. But the fundamental problem -- as some of the attached tickets start to appreciate -- is that the key information that determines whether an image fits the specification being looked for is not going to be stored in statements just on the CommonsData item for the image in question, nor just on the Wikidata item for the thing it depicts, but is going to depend on statements distributed throughout the database.
Jul 14 2018
Well of course you're going to copy all the CommonsData statements to Blazegraph.
I'd say don't give up too easily. This is probably as good an approach as any. If the issues are structural, bots will fall prey to them in just the same way, just more slowly and more haphazardly.
Don't you think you should maybe talk to the community about this first ?
Another example, where such image searches may depend quite sensitively on query construction or query optimisation: Category:Grade I listed buildings in Bedfordshire.
The attached subtickets are an interesting read. They all seem to be based on taking the Q-number value of "depicts", storing it as a string in the text-search index, and then doing an indexed string-match for it. Of course first baby-steps are important, and this facility will be crucial for confirming the correct entry, storage, and direct retrievability of "depicts" values.
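For what it's worth, the string-match approach the subtickets describe surfaces to users as a CirrusSearch keyword of the form `haswbstatement:P180=Q...`. The keyword syntax is the real one; the little helper below is just my own illustration of constructing it.

```python
# The indexed string-match described in the subtickets surfaces as a
# search keyword of the form haswbstatement:P180=Q..., matching files
# whose "depicts" value was indexed as a string.  The keyword syntax is
# the CirrusSearch convention; the helper and Q-number are illustrative.

def depicts_search_term(qid: str) -> str:
    """Build the full-text search term for files with depicts (P180) = qid."""
    return f"haswbstatement:P180={qid}"
```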
Mar 2 2018
One sidelight on this. QS2 is great (invaluable!) for adding new statements, but not so good for modifying existing statements, especially if they are currently heavily qualified or referenced -- at the moment, the entire statement complete with all qualifiers and references has to be re-created, even if one only wants to change one qualifier value, or migrate the property being used for the qualifier.
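The pain point is visible in the line format itself: in QuickStatements a statement and all its qualifiers travel as one tab-separated row (item, property, value, then qualifier property/value pairs), so changing a single qualifier means re-emitting the whole row. A sketch, with invented item and value data:

```python
# Illustration of why qualifier edits are painful: in QuickStatements'
# v1 line format a statement plus all its qualifiers is one tab-separated
# row, so changing one qualifier means rebuilding the entire row.
# The row layout follows the QS v1 convention; the data is invented.

def qs_line(item: str, prop: str, value: str,
            qualifiers: list[tuple[str, str]]) -> str:
    parts = [item, prop, value]
    for qprop, qvalue in qualifiers:
        parts += [qprop, qvalue]
    return "\t".join(parts)

# To change just one qualifier value, the whole statement is re-created:
line = qs_line("Q1234", "P31", "Q5678",
               [("P580", "+1850-00-00T00:00:00Z/9"),
                ("P1480", "Q5727902")])
```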
IIIF image extracts seem to be broken again: see eg the 'Virgin amongst the virgins' test page at Crotos
Nov 21 2017
Sep 18 2017
That looks so great now. Thank you! (And @Shonagon ).
@dschwen Any idea why the IIIF-based detail viewer is working for the images on this page of Shonagon's:
Sep 4 2017
Aug 21 2017
It would be good to make this work.
Apr 5 2017
Apr 4 2017
+1 on the need for this.
Apr 3 2017
To make this really useful, it would be valuable for a WDQS query to be able to display such shapes in the map view of query results.
Mar 24 2017
So you're saying, in effect, I should think of the strings being stored as a great big hash table rather than a B-tree, so there's nothing there that can help even STRSTARTS. And of course I know very little about the internals of Blazegraph, whereas you've actually written modules for it. But I did think Blazegraph uses B+ trees, which do specifically facilitate rapid in-order traversal and retrieval. But perhaps only to retrieve hashes, not strings?
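To spell out the contrast I'm drawing: a hash index can only answer exact-match lookups, whereas an ordered index (B+ tree style) can jump straight to a prefix range, which is exactly what STRSTARTS would want. A toy sketch of the ordered case, using a sorted list with binary search as a stand-in for the B+ tree:

```python
# Toy illustration of the hash-vs-ordered-index contrast: an ordered
# structure lets STRSTARTS-style prefix matching be answered with two
# binary searches, never touching entries outside the matching range.
# A sorted list stands in for the B+ tree here.

import bisect

def prefix_scan(sorted_strings: list[str], prefix: str) -> list[str]:
    """Return all strings with the given prefix via two binary searches."""
    lo = bisect.bisect_left(sorted_strings, prefix)
    hi = bisect.bisect_left(sorted_strings, prefix + "\uffff")
    return sorted_strings[lo:hi]

labels = sorted(["Alba Madonna", "Albatross", "Alberta", "Berlin", "Bern"])
```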
Thanks for your speedy diagnosis. I've gone through and reverted and then un-reverted the ten edits by hand, so they are now fine.
Ouch. That sounds quite nasty. The only thing I can think of from the user side that was perhaps slightly different about this set of edits was that they were made with QuickStatements while I already had a different QuickStatements run open and going in another window (a big batch of 10,000 corrections in capitalization of item labels, which took about two and a half hours to run, all of which went through fine).
Mar 13 2017
Hi Smalyshev, thanks for the comment; but if I can come back on your two objections:
Perfect. I would never ever have found that.
Another case where it could be useful to be able to cast calculated co-ordinates back to a geo:wktLiteral would be to allow AVG() co-ordinates to be calculated and plotted, in cases where SAMPLE() is not sufficient.
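The AVG() case is just a centroid cast back to the `"Point(long lat)"` form that `geo:wktLiteral` (and the WDQS map view) expects. A sketch of the computation (plain arithmetic mean, ignoring spherical effects and the antimeridian, which a real implementation would need to handle):

```python
# Sketch of the AVG() case: average a set of coordinates and cast the
# result back to the "Point(long lat)" text form that geo:wktLiteral
# expects.  Plain centroid only -- spherical effects and the antimeridian
# are deliberately ignored in this illustration.

def mean_point(coords: list[tuple[float, float]]) -> str:
    """coords as (longitude, latitude) pairs -> WKT Point literal text."""
    lon = sum(c[0] for c in coords) / len(coords)
    lat = sum(c[1] for c in coords) / len(coords)
    return f"Point({lon} {lat})"
```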
Trying to get round this, by just plotting the values in a scatterplot, without a map, also failed -- though that's likely to be an issue with the scatterplot routine. (Ticket T160325 opened).