So you're not really asking for a change in the JSON-LD; you're asking for Wikidata to only emit a single entity on the /entity/* endpoint, and that would/could apply to all the different representations using the Purtle backend. That's not a JSON-LD-specific request.
@daniel We could do something similar for stubs, with structures like:
Not a problem but a cosmetic proposal: Instead of having the structure:
@Reedy Should we close this task? The related change has been merged.
Tue, Oct 9
Wed, Oct 3
For example, if the call to mapframe is:
Fri, Sep 21
It seems to me that the only thing missing in the implementation compared to https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/RDF_mapping is the addition of schema:inLanguage for Lexemes. But it is derived data so it should not block the deployment to query.wikidata.org.
Wed, Sep 19
Thank you @Pintoch for raising this idea. For my constraint-violations fixing project, I can mine violations from the history offline, so I do not really need this feature.
Sep 14 2018
@Jonas Thank you for your feedback.
Sorry everyone for the troubles. I was experimenting with a tool that tries to find corrections for constraint violations.
I have modified it to send a proper User-Agent for all its requests to the Wikidata API, but I have not restarted it yet.
Sep 4 2018
Aug 30 2018
Aug 25 2018
Aug 23 2018
I would wait for the addition of proper statement serialization, which is tracked by T195043.
Aug 22 2018
Aug 21 2018
@dbarratt Thank you for planning to work on Wikibase+GraphQL.
Aug 15 2018
@Ankry the fix is planned to be deployed soon. Having some real pages still affected is a good way to check that the problem is indeed solved.
Aug 10 2018
@Tpt any idea how we can create a proper header/footer here (if this is the source of the problem)?
Aug 8 2018
It seems that the two pages could be parsed as JSON by PHP, so ProofreadPage assumes they are using the JSON serialization for Page: pages. But because they are not using the correct JSON serialization format, an exception is thrown.
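To illustrate the failure mode described above: successfully parsing as JSON is not a reliable signal that content uses a given JSON serialization, because arbitrary wikitext can happen to be valid JSON. This is a minimal Python sketch (the `"type": "proofread-page"` key is purely illustrative, not ProofreadPage's real schema) showing why a shape check must follow the parse:

```python
import json

def looks_like_page_json(text):
    """Return True only if the text is both valid JSON *and* matches the
    (hypothetical) Page: serialization shape. Relying on the parse alone,
    as described in the bug above, misclassifies JSON-looking wikitext."""
    try:
        data = json.loads(text)
    except ValueError:
        return False  # plain wikitext: fall back to legacy parsing
    # The extra shape check is what prevents the spurious exception.
    return isinstance(data, dict) and data.get("type") == "proofread-page"
```

The key point is that `'123'` or `'"some text"'` are valid JSON documents, so detection based on a bare `json.loads` success would wrongly route them to the JSON deserializer.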
Aug 4 2018
I have just made a change that uses LinkRenderer instead of Linker and so should fix the possible XSS injection: https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/ProofreadPage/+/450398/
Aug 3 2018
I also checked the potential SQLi and I don't see how it could happen. All "conds" arguments are key-value pairs where the key is a string, so values should be properly escaped. Most values seem to always be an integer or an array of DB keys obtained using the Title::getDBKey method. The errors are maybe raised because the query is constructed in a different class than the one where the execution is called.
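The reasoning above — that string-keyed conds are safe because only the values vary and those get escaped — can be sketched like this. This is a hypothetical illustration in Python, not MediaWiki's actual condition builder; it shows the pattern of treating keys as trusted column names and binding every value as a parameter:

```python
def make_conds(conds):
    """Build a WHERE fragment from a dict of column -> value(s).
    Column names come from the caller's code (trusted string keys);
    all values are bound as placeholders, so user-supplied data
    never reaches the SQL text itself."""
    parts, params = [], []
    for column, value in conds.items():
        if isinstance(value, (list, tuple)):
            placeholders = ", ".join("?" for _ in value)
            parts.append(f"{column} IN ({placeholders})")
            params.extend(value)
        else:
            parts.append(f"{column} = ?")
            params.append(value)
    return " AND ".join(parts), params
```

Under this pattern, even a value like `"1; DROP TABLE page"` only ever appears as a bound parameter, never as SQL.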
Aug 2 2018
Also, I understand it's still disabled on both wikidata and test.wikidata.
There is still one feature missing: outputting statements on form and senses when the RDF representation of a lexeme is required (e.g. on Special:EntityData/L12.nt).
Is https://gerrit.wikimedia.org/r/449718 covering this?
Aug 1 2018
Jul 28 2018
Jul 27 2018
Indeed, the set of properties and their datatypes is very static and cacheable, so we could use them as keys of a StatementByProperty object and then have StringStatement, StringSnak... types. The object would just be huge and might raise performance problems in the various GraphQL tools (we would have 4K+ keys).
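To make the size concern concrete, here is a hedged Python sketch that generates such a GraphQL-style type with one field per property (the `StatementByProperty` and `…Statement` names come from the comment above; the generator itself is purely illustrative). With the real property set, the loop would emit 4K+ field lines in a single type definition:

```python
def statements_by_property_sdl(properties):
    """Generate a GraphQL-SDL-like type with one field per property.

    properties: mapping of property id -> datatype name, e.g.
    {"P31": "Item"}. With Wikidata's ~4K+ properties the resulting
    type definition becomes enormous, which is the concern above."""
    fields = "\n".join(
        f"  {pid}: [{datatype}Statement]"
        for pid, datatype in sorted(properties.items())
    )
    return f"type StatementByProperty {{\n{fields}\n}}"
```

Because the property/datatype set changes rarely, the generated schema could indeed be cached; the open question is how GraphQL tooling copes with a type that wide.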
Jul 21 2018
@Tpt so it looks like right now you can't get a datavalue or recursively call item from a statement. I added a sample query to the task description.
Jul 14 2018
Done as part of T153120
Jul 13 2018
@Billinghurst Is the problem still happening now that the quality color bug is solved?
Jul 12 2018
The fallback to the categories seems to work. I plan to keep it as long as there exist Page: pages that do not contain the page property.
Jul 11 2018
Sorry, I was wrong, the fix is not deployed yet. It should be deployed this evening UTC.
As a logged-in user, it will also show normally/as expected.
Jul 10 2018
@Tpt it does not seem to work: https://fr.wikisource.org/wiki/Livre:Revue_du_monde_nouveau,_vol._I,_1874.djvu
The change that should fix this problem was deployed yesterday. A purge of the pages where blank ids are displayed should fix the problem (for Index: pages it could be done with a non-null edit on MediaWiki:Proofreadpage_index_template).
Jul 7 2018
This is a side effect of T198470. A fix for it should be deployed next week.
Jun 30 2018
The fallback has been implemented and merged into ProofreadPage. It is live now on https://en.wikisource.beta.wmflabs.org and will be deployed to Wikisources next Tuesday.
Jun 29 2018
It's probably because of change rEPRP502ff8adeddde1749001df704e0f389c37cb6e5e, which uses a page property for the quality lookup instead of the categories. It's much leaner and it allows having the same storage for all Wikisources independently of the quality category name. The page property was introduced by rEPRPec94d4f460c1221a4f794a32e8e3d6b677d326d4 last September, so I hoped most pages would have been purged, but it seems that's not the case...
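The lookup order described in these comments — prefer the page property, and fall back to the wiki-specific quality categories for pages that have not been purged yet — can be sketched as follows. All names here (the property key and the category mapping) are illustrative assumptions, not ProofreadPage's actual identifiers:

```python
# Hypothetical English-Wikisource-style mapping from quality category
# to quality level; each Wikisource names these categories differently,
# which is why the page property is the preferable storage.
QUALITY_CATEGORIES = {"Validated": 4, "Proofread": 3, "Not proofread": 1}

def page_quality(page_props, categories):
    """Return the quality level for a Page: page.

    Prefer the page property (uniform across wikis); fall back to the
    per-wiki quality categories for pages not yet re-parsed/purged."""
    if "proofreadpage_quality" in page_props:  # assumed property name
        return int(page_props["proofreadpage_quality"])
    for category, level in QUALITY_CATEGORIES.items():
        if category in categories:
            return level
    return None  # neither source available: quality unknown
```

This also shows why purging matters: until a page is re-parsed, only the category branch can answer, so the fallback has to stay in place.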
Jun 19 2018
We still have to implement the output of form statements when the RDF representation of a lexeme is requested (i.e. Special:EntityData/L42.ttl), as well as the schema:inLanguage relation for lexemes.
Jun 12 2018
Merging the current namespace fix in the next SWAT looks like a sensible way to get the namespaces working properly.
@Urbanecm Hey, it would be much better not to allocate custom namespace ids for the Page: and Index: namespaces, but instead let ProofreadPage set up the two namespaces with the proper names using the standard namespace ids 250 and 252. I have just made a change to ProofreadPage for that: https://gerrit.wikimedia.org/r/440010