Wed, Jan 15
Related or not, sometimes I get the generic error screen instead…
Tue, Jan 14
Considering all Wikibase instances, I think the reasonable default option is the talk page. On Wikidata this won't be the most efficient solution possible, but it can serve until the community discusses and agrees on a more definitive solution for Wikidata, IMHO. Right now we have https://www.wikidata.org/wiki/Template:Edit_request. If the URL were defined by a MediaWiki message, the community would be able to change the default link (e.g., https://www.wikidata.org/wiki/Talk:Q42) to a more specific Wikidata link (e.g., https://www.wikidata.org/w/index.php?title=Talk:Q42&action=edit&section=new&preload=Template:Edit%20request/preload).
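If it helps make the idea concrete, here's a minimal sketch of building that kind of "new edit request" link; the function name and the default preload value are just assumptions for the example, not anything defined in MediaWiki:

```python
from urllib.parse import urlencode

def edit_request_url(base, talk_title, preload="Template:Edit request/preload"):
    """Build a URL that opens a new talk-page section with a preloaded template."""
    params = {
        "title": talk_title,
        "action": "edit",
        "section": "new",
        "preload": preload,
    }
    return base + "/w/index.php?" + urlencode(params)

print(edit_request_url("https://www.wikidata.org", "Talk:Q42"))
```

The point is only that the whole link is derived from the talk-page title, so a single community-editable message could define it per wiki.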
Thu, Jan 2
Dec 15 2019
On Q78347074 we have two cases: family name and given name. Purging both the Item page and the property pages doesn't fix the issue.
Dec 5 2019
Side note/spoiler: this month I expect to survey users as part of the analysis of property constraints, which will tell us which constraint types users want most, considering this one, the rest of the subtasks of T213803, and others.
Nov 19 2019
Oct 19 2019
Oct 15 2019
Oct 14 2019
Yes. But do Wikibase installation owners and technical people know how to escalate those issues? I think they're escalating them, but through private channels. That means that if the problem is on their side, the next user with the same problem finds nothing in a web search or in the documentation and asks again; and if the bug is in the software, it is sometimes forgotten or worked around, or the user even changes their use case, before the bug ever becomes a Phabricator task.
Oct 13 2019
The status is granted automatically by the software after carrying out a number of edits. By "care about Wikidata" people usually mean learning what Wikidata is about by carrying out those edits. In that (usual) case, requesting the flag is unnecessary.
(By the way, we can consider lowering the threshold of the number of edits if we conclude that it's too high, although it was actually increased a few years ago for the opposite reason; I was not involved in the process and, honestly, I don't remember what that threshold is right now.)
Oct 11 2019
Oct 10 2019
Editors usually hold two opposing views on this option. One is similar to what @Amire80 describes: if we propagate autoconfirmed status from Wikipedias to Wikidata, those autoconfirmed users are free to work on their wikis autonomously, without having to care about Wikidata. The other view is that they actually should care about Wikidata, that the confirmed status is easy to get, and that you get it automatically after carrying out the few test edits needed to learn what Wikidata is about. Some people will say the first option pursues our goals and contributes more to centralization/harmonization, and others will say exactly the same about the second.
Oct 8 2019
I want to thank you for finally solving this task. Having users generate content collaboratively and horizontally is a great idea, but this model doesn't favor process efficiency, user experience or innovation, so these areas need skilled professionals who push back against tradition, stagnation, short-termism and subjectivity. Users hardly ever like change, but I do think the improvement is significant, and right now I don't miss anything from the old version. Keep up the good work! :-)
Oct 7 2019
Sep 27 2019
Okay! But I'd need some help to know what/where to touch. To set HTTP headers for the Special:EntityData pages I would probably change /repo/includes/LinkedData/EntityDataRequestHandler.php, but what should I touch if I want to set HTTP headers for the wiki pages (not necessarily loading them through Special:EntityData)?
Thanks for the task, @Lucas_Werkmeister_WMDE!
Thank you both! :-)
Sep 22 2019
Thanks for all the work! I have a question: which dimensions of data quality (completeness, accuracy, consistency...) are you considering when you speak of "quality" in this scope? The term "quality" is a buzzword used to name things that sometimes have no relationship to each other, so I'm not sure what it means here in practical terms; I don't know which factors are included in the equation (and which are excluded and should be measured separately).
Sep 17 2019
Any thoughts on this?
Hopefully these needs will soon be met by https://etl.linkedpipes.com/; OpenRefine has also added some features and improved its integration with Wikidata.
This was fixed a long time ago. Thanks!
Sep 7 2019
Thanks! Is this fixed now that wikibugs is online?
My usual behavior is (and surely will be) to open these kinds of links in new tabs even if they don't have a target="_blank". Which doesn't necessarily mean my behavior is perfect, of course. :-)
Sep 5 2019
The highlighting of statement groups ("properties") is now available on Wikidata. The highlighting of individual statements still needs approval on Gerrit.
Sep 2 2019
Aug 30 2019
Aug 29 2019
I think the redirect chain is currently incorrect. When I load /ontology over HTTPS, a redirect takes me to an HTTP page, and then another redirect switches back to HTTPS:
https://wikiba.se/ontology → http://wikiba.se/ontology-1.0.owl → https://wikiba.se/ontology-1.0.owl
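A quick way to spot the downgrade programmatically, given a redirect chain observed e.g. with `curl -sIL`; `downgrade_steps` is a hypothetical helper written for this comment, not part of any Wikibase code:

```python
from urllib.parse import urlparse

def downgrade_steps(chain):
    """Return the (from, to) hops where the scheme drops from https to http."""
    schemes = [urlparse(url).scheme for url in chain]
    return [
        (chain[i], chain[i + 1])
        for i in range(len(chain) - 1)
        if schemes[i] == "https" and schemes[i + 1] == "http"
    ]

chain = [
    "https://wikiba.se/ontology",
    "http://wikiba.se/ontology-1.0.owl",
    "https://wikiba.se/ontology-1.0.owl",
]
print(downgrade_steps(chain))  # the first hop is a downgrade
```

Fixing the first redirect to target the HTTPS URL directly would also save one round trip.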
Aug 22 2019
Aug 17 2019
How can we help?
Aug 15 2019
Aug 14 2019
Jul 31 2019
When requesting data via the Accept HTTP header, at least the following values are supported: application/json, application/vnd.php.serialized, text/n3, text/rdf+n3, text/turtle, application/x-turtle, application/n-triples, text/n-triples, text/plain, application/rdf+xml, application/xml, text/xml, application/ld+json. These values are just aliases (redirects) to the URLs with the extensions; some of them are duplicates, and some (e.g., text/rdf+n3) aren't media types recognized by IANA either. Still, I do think these redirects are useful and guess they don't significantly increase the maintenance cost. In particular, redirecting some extensions to others might be simpler and more maintainable than what is done for header values, and in that case I would wonder why not give users what they want when we know unambiguously what it is.
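To make the alias idea concrete, here's a rough sketch of such a mapping; the canonical keys and the dictionary are my own assumptions for illustration, not Wikibase's actual Special:EntityData implementation:

```python
# Map Accept-header values, including aliases and non-IANA types,
# to one canonical serialization key.
CANONICAL = {
    "application/json": "json",
    "application/ld+json": "jsonld",
    "application/vnd.php.serialized": "php",
    "text/turtle": "ttl",
    "application/x-turtle": "ttl",
    "text/n3": "n3",
    "text/rdf+n3": "n3",          # not an IANA-registered media type
    "application/n-triples": "nt",
    "text/n-triples": "nt",
    "text/plain": "nt",
    "application/rdf+xml": "rdf",
    "application/xml": "rdf",
    "text/xml": "rdf",
}

def serialization_for(accept_value):
    """Return the canonical format key for an Accept value, or None."""
    media_type = accept_value.split(";")[0].strip().lower()  # drop q-params
    return CANONICAL.get(media_type)

print(serialization_for("text/rdf+n3"))         # n3
print(serialization_for("TEXT/TURTLE; q=0.9"))  # ttl
```

With a table like this, every alias collapses to one target, so adding or removing an alias is a one-line change.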
Jul 26 2019
Jul 19 2019
Hey, thanks for the reference! I didn't know that task was open. But the two tasks aren't exactly the same: this one is about automatically generating wikilinks from every entity ID that doesn't have an explicit link in the raw text. For example...