Jan 15 2023
Dec 14 2022
The OSM Sophox service, which is based on WDQS, has (or had) a SERVICE wikibase:tabular which appears to provide this capability,
so there may be code already in existence that could be merged, or at least used as a basis
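For reference, a sketch of how such a tabular service call might look, modelled loosely on the mwapi service syntax. The service name comes from Sophox, but the parameter names used here (wikibase:url, the per-column output bindings) are assumptions for illustration, not a documented interface:

```sparql
SELECT ?name ?population WHERE {
  SERVICE wikibase:tabular {
    # hypothetical parameters -- the real Sophox interface may well differ
    bd:serviceParam wikibase:url <https://example.org/cities.csv> .
    ?name       wikibase:apiOutput "name" .        # bind the "name" column
    ?population wikibase:apiOutput "population" .  # bind the "population" column
  }
}
```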
Aug 27 2022
This came up a couple of days ago in discussion again on the wikidata Telegram channel, with several people wishing it was possible, since "displaying the whole url often breaks the table layout".
Aug 13 2022
Thanks @Aklapper I'll add it to that Cat-a-lot request page. One further question though: how can/should tickets be categorised as relating to search, if tagging them Discovery-Search is not appropriate?
Aug 12 2022
Possibly: checking that the new search still allowed this functionality was a concern behind T275656 "Check impact of making mediasearch default on Visual File Change and cat-a-lot".
Aug 1 2022
Part of the expectation of an RDF-based system is that it should be easy to retrieve URLs of a particular form.
Jul 28 2022
A comment on the requirement
Removing a redirect badge from a sitelink that points to a redirected page is disallowed
Jul 20 2022
Also raised in the triage-hour discussion was ticket: T207705 "Implement the Extended Date/Time Format Specification" (EDTF)
As noted by @GreenReaper above, the Wikibase_EDTF wikibase extension should now give a solid basis for building EDTF support on wikibase, allowing EDTF strings to be input, validated, and rendered by the wikibase GUI, if we want to add properties with an EDTF datatype to Wikibase.
Jul 16 2022
Possibly related to: T187935 "Allow cross-slot access during HTML rendering"
As noted at the Data Quality online workshop last weekend, this lack of registration is also causing a problem for the community in the deletion discussion process on wikidata: unlike usage in wikidata statements, no warning is given when an item being considered for deletion at wikidata is referenced by SDC statements.
Jul 12 2022
Jul 11 2022
Jun 24 2022
Not a very happy situation, though, if there is no way in WDQS to retrieve a Julian-format date as it was entered on the wikidata item, and as it would be universally given in sources.
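For what it's worth, the calendar model at least is retrievable from the full value node, even though the time value itself is always serialized in the proleptic Gregorian calendar. A sketch, using dates of birth (P569) as the example property:

```sparql
SELECT ?item ?time ?calendar WHERE {
  ?item p:P569/psv:P569 ?valueNode .
  ?valueNode wikibase:timeValue ?time ;              # always Gregorian-normalized
             wikibase:timeCalendarModel ?calendar .  # Q1985786 = proleptic Julian
}
LIMIT 10
```

So one can tell *that* a date was recorded as Julian, but converting it back to its Julian form would have to be done client-side.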
Jun 22 2022
Text used for the layers pop-up in the map view has also become tiny (ie the text from hovering over the layers button at the top right in a query like https://w.wiki/5L4s)
May 26 2022
It is also particularly annoying that at the moment one cannot even add the badge to an existing redirect sitelink, without the above error message and the edit being blocked. It would be really good to get this fixed.
Just to note that the proposed recommended user behaviour has not yet been implemented:
- GIVEN an Item
- AND a page on the client that is a redirect
- WHEN adding the page as a sitelink to the Item
- AND adding a redirect badge in the same edit
- THEN the sitelink and associated badge are stored (even if the redirect target is already used in another Item)
May 21 2022
Thanks for that clarification, Lucas, that's useful. So yes, I can use the "search" API instead: https://w.wiki/5BsM and successfully retrieve far more entries (albeit very slowly -- the query took almost 100 seconds, just for 500 results); but then I cannot restrict the search to just the labels of items. (I can restrict the search to titles of pages -- but for wikidata the titles appear to be Q-numbers, so that doesn't help). So I still can't get the items I need.
Previously raised on-wiki at
Apr 18 2022
Duplicate of T189423
Yes: it seems that where there are multiple results with the same coordinates, in the same layer, only one is shown, the others are suppressed
Apr 17 2022
(Note: the previous comment was inadvertently saved when only a third written, so it may be necessary to check the web version (if not shown here already) for the full text).
There are a few more examples of queries using the service that can be found by searching the archives of the Request-a-Query page, here, and a few more still if the search is widened further to all wiki pages on wikidata (here; limited to pages in English, to suppress translation duplicates).
Feb 18 2022
This would be difficult to do within WDQS (and a distraction from the main purpose of WDQS ?)
We really ought to be doing better than this
Feb 14 2022
@CBogen Can you clarify the 'decline' here?
Nov 15 2021
@WDoranWMF Will, could you clarify in what sense this ticket is "invalid" ?
Apr 6 2021
Re not removing a badge: note that it may be important for a user or bot to be able to change a badge, in particular from 'sitelink to redirect' (x) to 'intentional sitelink to redirect' (x), if the user determines that the redirect is valuable.
Mar 30 2021
Addshore's suggested way forward from 10 February seems very sensible.
Mar 15 2021
Probably an effect of T168341, if the count values were not unique
I just got bitten by this, as described here at WD:RAQ.
Feb 22 2021
A couple of follow-up things.
Feb 20 2021
A bit more about the use-case. Early next month the external Viae Regiae project, with which Wikidata:WikiProject Early Modern England and Wales is closely co-operating, will start a mass participation effort to transcribe all of the places and placenames on several series of 16th and 17th century maps, like this 1576 Saxton map of Essex on Commons. (They will actually be using a higher-resolution copy, which we will be uploading).
Nov 26 2020
Sep 21 2020
Platform team's IIIF project homepage: https://www.mediawiki.org/wiki/Core_Platform_Team/Initiatives/IIIF_API
I've written up a bit about today's zoom call at Commons:Village Pump, here. Feel free to add / correct / amend as desired.
Sep 17 2020
Oh, now I've found it. The speech-bubble button at the top left of the view toggles a display of all of the annotations recorded on the image -- very, ''very'' nice work. Thank you!!
That Mirador view is very nice. I like the way that it is able to operate from an IIIF JSON manifest generated from the wikidata item for the underlying object (code built by @LucasWerkmeister I think), which then allows a description of the image to be displayed using the (i) button at the top right.
Sep 16 2020
Long-term subscribers to this ticket will be excited to see T261621 "Support the addition of the IIIF API for Wikimedia projects regarding content partnerships", created two weeks ago
plus this announcement that's just appeared on the IIIF community's IIIF-discuss list ( https://groups.google.com/g/iiif-discuss/c/r9yf2GnaF1U ) :
Aug 10 2020
Aug 3 2020
The answer is, we ought to remove the unused ones from the RDF dump too, as discussed at T258474
@dcausse It *does* hurt a person who is trying to make sense of the dump: they will see all these unfamiliar prefixes declared, and may then assume there are corresponding kinds of predicates or objects that they have to make sense of.
Aug 1 2020
Ticket description should be re-written.
In some ways it's quite nice that WCQS uses tinyurl rather than the w.wiki shortener -- at least it means there is not such a limit on how long the query can be. (T220703).
Jul 24 2020
It would be helpful if at least one of the rdf:type statements were retained, as they make it easy to select a subset of M-IDs for a query to work on
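For example, with a type triple retained, one could select just the image-type mediainfo entities with something like the following sketch (schema:ImageObject here is an assumption about which class name the RDF uses):

```sparql
SELECT ?mediainfo WHERE {
  ?mediainfo rdf:type schema:ImageObject .  # assumed class name for image files
}
LIMIT 100
```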
Most of the above prefixes will be unnecessary, unless we propose to create any new properties local to Commons not defined on Wikidata.
Given that mediainfo items are just of the form sdc:M12345, is much meaningful autocompletion for these actually possible?
It also might be worth making the M-ID number prominently visible on the structured data tab of the filepage, given that this is where the information related to that ID is shown.
@Lucas_Werkmeister_WMDE : I think it would be more feasible to add the Special:FilePath URL to the WikibaseMediaInfo RDF, and combine WDQS and WCQS that way
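A sketch of what that combination might then look like, federating from WDQS out to WCQS. Both the WCQS endpoint URL and the predicate carrying the Special:FilePath URL are assumptions for illustration, not existing interfaces:

```sparql
SELECT ?item ?mediainfo WHERE {
  ?item wdt:P18 ?image .                                # image URL on the wikidata side
  SERVICE <https://commons-query.wikimedia.org/sparql> {
    ?mediainfo schema:contentUrl ?image .               # hypothetical predicate for the file URL
  }
}
LIMIT 10
```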
PS. It's also stupidly hard to find the M-ID from a WikiCommons file page at the moment. This would be a good thing to display in the "structured data" tab there, I think.
I think you meant
wd:Q123 wdtn:P18 sdoc:M6919529
in that second line ?
IMO the best solution here would be to add triples of the form
Mar 17 2020
Yes, there's a cost to you of providing a service based on current WDQS, that then has to be ripped out for a new version based on WDQS 2.
Mar 16 2020
Engineering a completely new search facility for Commons Data rather than using SPARQL is a *stupid* *waste* *of* *time* *and* *resources*.
Mar 4 2020
Mar 2 2020
That's a bit of a problem, given what the badges are for...
I tried to add an "intentional sitelink to redirect" badge on the English sitelink for asteroid 6765 Fibonacci, but got "Could not save due to an error. The save has failed."
Feb 28 2020
regarding editors : it looks like some works have multiple editors, none of which have Wikidata items. These ought to be given in separate statements distinguished by "series ordinal".
But it may be that because the statements both have the same 'main value' (or, rather, they both have somevalue for the main value), the software used to add the statements may have coalesced them together. QuickStatements tends to do this, I think.
- "named as" = name given to the subject of the statement
- "stated as" = how the object of the statement was stated
Feb 19 2020
@Lucas_Werkmeister_WMDE The qualifier "stated as" (P1932) is currently used on 6.6 million statements. I couldn't get a query to complete to count how many of those statements have an object that's a blank node. My guess is on the order of 10,000, but that's just a number pulled out of the air, not based on anything. It could be a *lot* more, if this mechanism has been used eg for scientific papers with unmatched editors, publishers, etc.
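For the record, the query I was trying was essentially of this shape (it times out at full scale; note that in the current RDF the unknown values appear as blank nodes, hence the isBlank() test):

```sparql
SELECT (COUNT(DISTINCT ?statement) AS ?count) WHERE {
  ?statement pq:P1932 ?statedAs ;   # statements carrying a "stated as" qualifier
             ?ps ?value .
  FILTER(STRSTARTS(STR(?ps), "http://www.wikidata.org/prop/statement/"))
  FILTER(isBlank(?value))           # main value is an unknown-value blank node
}
```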
Feb 11 2020
Feb 8 2020
Example of a Listeria tracking page, counting how many blank nodes are being used this way for the properties used on a particular set of items (in this case: a particular set of books, where the publisher (known) may not yet have an item, or at least not yet a matched item): https://www.wikidata.org/wiki/Wikidata:WikiProject_BL19C/titles_stmts
Please don't think or refer to the blank nodes as just "unknown values".
Jan 16 2020
Thanks Matthias, but that doesn't seem to help with the use-cases above, if a claim is just sitting there as a claim, and not being used by any Lua template (as most "depicts" claims probably wouldn't be?)
Dec 9 2019
Lead ticket for the Vue migration for Wikidata would appear to be T157014. After sustained activity in 2017, followed by a short spike in June-July 2018, it's not clear how much further progress has been made, or is currently anticipated. There is a mention that the Lexemes roll-out in 2018 included some Vue templates and widgets with PHP server-side rendering, which was due to be reviewed.
WMF projects use OOUI and org-wide design principles. Wikibase does not.
Dec 7 2019
Given the endless, and seemingly ongoing, difficulties that the decision to fork the wikibase UI continues to produce -- most types of statement still not available, somevalue not available, deprecation not available, references not available -- at what point does it make more sense to withdraw the forked UI as a costly experiment that hasn't worked, and is causing more trouble than it is worth?
Nov 29 2019
@Pintoch That's great. I'll try to get these matched this weekend. Is there any chance of the full dataset, beyond the first 1000? We currently have 1660 items for bodies of water in the UK (plus more that possibly don't have P31s), so it would be nice to be able to try to match them all. But thanks again for this!
Nov 23 2019
If maxlag is to be based on the maximum lag of the pooled servers, will there be active measures to monitor these, and take any really badly lagged server (ie one significantly more lagged than any of the others) out of the pool, and out of the maxlag calculation, to give it a chance to recover?
Nov 15 2019
I would love it if somebody would take this on. I think it's a really worthwhile resource to link to, and a really significant topic to improve our data for.
Nov 14 2019
Hi @Pintoch. Thanks for this.
One thing that seems odd (to an outsider like me who knows very little about the system) is that some servers seem to be performing so much worse than others.
Oct 31 2019
@Lucas_Werkmeister_WMDE How dependent is the wikibase constraint system on SPARQL ?
Maarten was just suggesting that a functioning SPARQL service is required for any of the constraint checking to operate (and so this ticket would be completely blocked until CDQS is reliably up and running)
Is that correct? Or are any of the simplest of the constraint checks (eg format constraints, one-of constraints, etc) available without SPARQL ?