Jul 29 2020
I think there are at least seven advantages:
- File description pages are easily visible to all users of Commons files (on Wikipedia when a page is displayed, when searching on Commons, at Wikidata, etc.).
- The pagination on the file description page makes it easy to view a single page of a book.
- Many djvu/pdf files available on Commons are not transcribed at Wikisource and may never be. This is especially true now that tens of thousands of Files_from_the_Biodiversity_Heritage_Library are being mass-uploaded.
- Even if just the index page exists (e.g. sample), it's unclear whether having only that part there really helps users of the file (if no transcription follows). Some Wikisources might not want otherwise unused index pages (at least some language versions, I'd guess).
- Pagination of a djvu/pdf is file metadata. As such, I think it should be included also as structured data on Commons.
- Most other fields of index pages are already available on file description pages.
- It might get Commons contributors to add data that is later useful for Wikisource contributors.
Jul 22 2020
May 31 2020
Whatever approach is chosen to determine stubs, it could set the status as a badge on Wikidata; this can then be queried easily.
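A minimal sketch of that idea, assuming the stub status is stored as a sitelink badge (the badge item ID and the dict layout below are placeholders, not the real Wikidata data model; on the query service the equivalent would be a match on the wikibase:badge predicate):

```python
# Hypothetical stub badge; a real one would be an actual Wikidata item ID.
STUB_BADGE = "Q-stub-badge"

sitelinks = [
    {"site": "enwiki", "badges": []},
    {"site": "dewiki", "badges": [STUB_BADGE]},
    {"site": "frwiki", "badges": ["Q-other-badge"]},
]

def stub_sites(links):
    """Sites whose linked article carries the stub badge."""
    return [link["site"] for link in links if STUB_BADGE in link["badges"]]

print(stub_sites(sitelinks))  # ['dewiki']
```

Once the badge exists, "find all stubs for item X" is a single filter rather than a heuristic.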
Jun 27 2019
@Rotpunkt I think this might be a simpler solution to contributors' problems than the one proposed a year ago but never implemented.
Wikidata seems to lag behind other parts of WMF in implementing HTTPS. Can we do something about it?
Jun 26 2019
It is impossible to get back the original, human-readable LaTeX string from some sanitized version.
I don’t see how that’s related to this task.
Somehow I think the task as it's currently worded above is too complicated to solve the initial problem. The idea is that an input of
leads to a triple that contains only:
as it would be if the datatype were string.
To avoid that people confuse claims and statements in general, maybe the feature should use an entirely different name.
I'm trying to figure out what the volume of queries on WQS may be:
For the sake of clarity, maybe it should be mentioned that the presence of a psn: triple won't impact users of wdt: or ps: triples.
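A toy model of that point, with triples as plain (subject, predicate, object) strings (the property P999 and the values are illustrative): adding a parallel psn: triple changes nothing for consumers that match only wdt: or ps: predicates.

```python
triples = {
    ("wd:Q1", "wdt:P999", '"x^2"'),     # truthy triple (P999 is a placeholder)
    ("s:Q1-guid", "ps:P999", '"x^2"'),  # statement triple
}

def by_prefix(prefix):
    """All triples whose predicate starts with the given prefix."""
    return {t for t in triples if t[1].startswith(prefix)}

before_wdt, before_ps = by_prefix("wdt:"), by_prefix("ps:")

# The extra normalized triple is added alongside the existing ones:
triples.add(("s:Q1-guid", "psn:P999", '"x^{2}"'))

# Queries over wdt: and ps: see exactly the same results as before.
assert by_prefix("wdt:") == before_wdt
assert by_prefix("ps:") == before_ps
print(len(by_prefix("psn:")))  # 1
```

The new predicate is purely additive: existing query patterns simply never bind it.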
Oh, I didn't know that. Which one is the largest?
outputting the input string "IS" simple
Is it just an impression, or do we now get redirected through http: when clicking on a Wikidata item in results on an https connection to WQS?
Maybe some of the features Pasleim wrote for https://tools.wmflabs.org/pltools/recentdeaths/ could be included.
Maybe a simple way to implement this could be a psn: triple with the string.
Excellent. Works fine. Thanks.
Jun 18 2019
Marking this as stalled for now with a note to reopen once AMC is more widely available.
Wikidata should be accessible to mobile Wikipedia readers, not just advanced mobile contributors.
The idea is that one could navigate to the mobile view of Wikidata. I don't actually suggest that people should use a mobile to contribute to Wikidata, nor be required to log in to view Wikidata.
off-topic: what is cool is that it displays images (which www.wikidata.org doesn't)
Why is there such a "need"?
For harvesttemplates, it would be great if this could be queried/defined somewhere
We could use the https://www.wikidata.org/wiki/Q64335281 approach. Also solves the problem of structured data on EntitySchemas.
A problem we had with the Russian gadget is that people try to "update" information from the client instead of adding a new statement, e.g. change 2019 to 18 June 2019. In general, this isn't much of an issue for this type of edit except when the statement on Wikidata includes a reference stating "2019" and not "18 June 2019". Even a typo could be something that requires a new statement.
Jun 15 2019
Jun 13 2019
Simple solution could be to use items instead, see https://www.wikidata.org/wiki/Q64335281
I changed the sample in the description above.
+ check items against EntitySchemas
As PetScan is having some trouble these days, I'm using the query server/mwapi:generator "categorymembers" again.
May 31 2019
Wikidata has most of that: https://www.wikidata.org/wiki/Wikidata:Database_reports/WMF_projects
May 29 2019
Maybe it should be a Wikimedia-Site-requests or an interwiki one.
No problem. I made E27 directly. The bad news: now I have to come up with items that match the schema ;)
Good point.
We used to have a URL property that was suitable for that; not sure where it went.
May 28 2019
Somehow I like the summary. Anyway, here it is for future reference.
May 26 2019
I'm not sure if that can be sorted out. There seems to be lots of politics and possible conflicts of interests involved, none really related to the simple issue at hand.
Currently, I don't think the constraint system allows limiting a constraint's scope to items only.
The IPA statement on Q102090 isn't really a sample to follow.
I don't think the wdtn triples are needed. I'd just add the "wikibase:identifier" ones.
May 25 2019
P407 isn't needed on lexemes. In fact, I asked them not to add it.
You'd need that if most content you want to enter is directly in UPA
Do you want to create codes that allow entering UPA directly? e.g. "sms-fonupa" ?
The current (above) format of the triples (when it works) seems fine from a Wikidata perspective, but from a LOD perspective, shouldn't the triple just be something like:
Suggestions are better than before, but I think a review should be done to see what is still open and what could be improved further.
Maybe you want to add support for replacement_property as well.
May 23 2019
For (non-contemporary) people, I think an interesting grouping could be by century, but I'm not entirely sure how that could work without a dedicated statement.
Not sure if this is relevant here, but the most frequent unhappy paths I notice are:
It seems to be using grouping_threshold; I noticed that when I started using a fairly high one.
For grouping by some sub-national level, maybe a separate query to get levels first is the most efficient. A shorthand for that could be a P31 statement or a property (if there is one).
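A sketch of that two-step approach (region names and item IDs below are made up): one cheap lookup resolves each place to its sub-national level, then the items are grouped against that table instead of joining levels into the main query.

```python
from collections import defaultdict

# Step 1: a separate query would yield the sub-national level per place;
# here it is just a stub lookup table.
place_to_region = {
    "Lyon": "Auvergne-Rhône-Alpes",
    "Nice": "Provence-Alpes-Côte d'Azur",
}

# Items to group: (item ID, place) pairs.
items = [("Q1", "Lyon"), ("Q2", "Nice"), ("Q3", "Lyon")]

# Step 2: group items by the level fetched above.
groups = defaultdict(list)
for qid, place in items:
    groups[place_to_region[place]].append(qid)

for region, qids in groups.items():
    print(region, qids)
```

Splitting the level lookup out keeps the main query small, at the cost of a second round trip.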
Anyway, thanks for adding so many of the suggestions. I'm aware that it's usually up to the coders to pick them.
Maybe items with Q5 could have separate stats.
May 21 2019
Some not so random suggestions:
Independently of the above, is there a way to use CONSTRUCT ?
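For context, a miniature illustration of what CONSTRUCT adds over SELECT: instead of returning rows of variable bindings, each solution is poured into a triple template, producing a new graph. The bindings and template below are invented for the example.

```python
# Solutions as a SELECT would return them (item/label values are made up).
bindings = [
    {"item": "wd:Q64", "label": '"Berlin"'},
    {"item": "wd:Q90", "label": '"Paris"'},
]

# CONSTRUCT { ?item rdfs:label ?label } applied to every solution
# yields one triple per binding row:
constructed = {(b["item"], "rdfs:label", b["label"]) for b in bindings}

for triple in sorted(constructed):
    print(triple)
```

The output of a CONSTRUCT query is itself RDF, so it can be loaded into another triple store rather than parsed as a result table.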
Alternative: query by units. Sample: https://www.wikidata.org/wiki/Property_talk:P6591#Query_with_%C2%B0F/%C2%B0C_conversion
May 19 2019
What is needed to run the generator just once, or at least once for a given variable?
Apparently codes for lexemes are distinct from monolingual ones. We could solve this first and wait for the other to be sorted out, or whatever.
Interesting. "mad" outputs more than "max" ;)
May 18 2019
Not that it changes the problem, but the current URL should have preferred rank and the former URL normal rank. See https://www.wikidata.org/wiki/Help:Ranking#Deprecated_rank