Sat, Jul 22
Grr, checking the statement ID now works, but the example uses the wrong capitalization for the item ID…
Ugh, I should’ve remembered that, I’m pretty sure I ran into that problem before… thanks.
Fri, Jul 21
By the way, the affected properties are: taxon name (P225), category's main topic (P301), Commons category (P373), INSEE municipality code (P374), IMA status and/or rank (P579), GNIS ID (P590). Any item with a statement for one or more of these properties gets the error.
@daniel wow, thanks… I’ll try to account for both possibilities so the test works locally and in prod. (I assume the CI systems also don’t have strict mode.)
That value is truncated in the database as well (at 65535 bytes), because the field is a BLOB… I guess it should be a MEDIUMBLOB instead.
For now I guess discarding (and logging) invalid values would be okay. But mid-term, the field should either be altered or overly long values should be rejected (which might also be a good idea for performance?).
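If we go the schema-change route, the alteration would be something like this sketch (the table and column names here are my assumptions, not copied from the actual schema):

```sql
-- Hypothetical names; the real table/column may differ.
-- Widens the parameters field from BLOB (64 KiB) to MEDIUMBLOB (16 MiB).
ALTER TABLE wbqc_constraints
  MODIFY constraint_parameters MEDIUMBLOB;
```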
And it looks like a format constraint for P1793 (grep for 2f15c2589f391670c05a92158acb29bb98969d7e) might be incorrectly escaped.
Hm, line 4367 (grep for 0f3eaaa5603250aa8f95ef50396ac73dfef58339) seems to be truncated. Did that happen during the table dump or is it truncated in the database?
Thanks, I’ll see if I can find anything suspicious in the table.
Okay, the new version of the fix also includes logging, so that should hopefully give us more information too (if you agree that the logging is a good idea and backport the fix with it included).
Seems to work a lot better now – at least, books are now properly recognized as works. (Note that Douglas Adams and some other items currently don’t work at all due to T171295: Fatal error: Argument 4 passed to WikibaseQuality\ConstraintReport\Constraint::__construct() must be array, but that’s unrelated.)
I uploaded a quick’n’dirty fix, but I have no idea why it’s necessary, and I’d like to understand that. Are there any constraints in the table whose constraint parameters are the literal string "null", invalid JSON, or an empty column?
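For the “discard and log” approach, the validation I have in mind is roughly this (a Python sketch of the logic only; the actual fix is PHP, and the function name is mine):

```python
import json

def classify_parameters(raw):
    """Classify a raw constraint parameters value from the table.

    Returns one of: 'empty', 'null', 'invalid-json', 'ok'.
    """
    if raw is None or raw == '':
        return 'empty'          # empty column
    if raw == 'null':
        return 'null'           # the literal string "null" stored as the value
    try:
        json.loads(raw)
    except ValueError:
        return 'invalid-json'   # e.g. a truncated blob
    return 'ok'
```

Anything other than 'ok' would be discarded and logged.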
Do you have any other information? Stack trace, anything?
I think we can close this now, constraint statements are enabled on Wikidata and were also imported from the existing statements.
Aaand now the API and Special:ConstraintReport give me HTTP 500 on Q42… :(
Once someone runs the scripts in T169647: Enable constraint statements on Wikidata, this can be closed (though I’m a bit sad we told the community that we could do this before the full migration and then nothing happened). Did you do that?
Thu, Jul 20
Fix is deployed and special page and API work again – thanks a lot aude and Reedy!
Yay, thanks aude and Reedy! :)
UBN! because it’s happening right now on Wikidata for most constraint checks. Note that T169647: Enable constraint statements on Wikidata will hopefully mitigate this, since the constraint parameters from statements are hopefully not broken (⇒ no ConstraintParameterException).
I don’t think anything has been backported for this issue, so I guess we’ll know for sure later today when the new build is deployed. (Where “we” is someone who has access to the logs, i. e. not me :) )
Wed, Jul 19
IRL discussion result: add another status, but make it “warning”, which is the new default, and “violation” then serves as the status for mandatory constraints.
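In other words, the mapping would be roughly this (a sketch; the function and status names are my assumptions):

```python
def result_status(constraint_is_mandatory, compliant):
    """Map a check result to a status under the new scheme (sketch).

    'warning' is the new default for violations; 'violation' is
    reserved for violations of mandatory constraints.
    """
    if compliant:
        return 'compliance'
    return 'violation' if constraint_is_mandatory else 'warning'
```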
One possibility would be to add a new status, “mandatory violation” (or “severe violation”?).
Here’s what it looks like with the two linked changes:
Tue, Jul 18
Result of IRL discussion:
These entries are shown when there is a problem with the constraint definition statement itself, e. g. when a “conflicts with” constraint statement doesn’t specify the other property that a property conflicts with. Those problems are shown in two ways: on the constraint statement itself (see T169531 – I just added a screenshot there), and on every statement of this property on other entities.
By the way, here’s how this looks:
Here’s what it could look like:
Note that the gadget doesn’t have the current revision ID after the save, so the API has to do the staleness check and return it as part of the result. The gadget would then periodically retry until it gets a fresh result.
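The retry loop on the gadget side could look roughly like this (a Python sketch of the logic; the real gadget is JavaScript, and `check_constraints` and its return shape are my assumptions):

```python
import time

def poll_until_fresh(check_constraints, expected_revision,
                     max_attempts=10, delay=1.0, sleep=time.sleep):
    """Call the (assumed) API wrapper until the result was computed
    from a revision at least as new as the one we just saved.

    The API does the staleness check and reports which revision
    the result is based on.
    """
    for _ in range(max_attempts):
        result = check_constraints()             # one API request
        if result['revision'] >= expected_revision:
            return result                        # fresh enough, stop retrying
        sleep(delay)                             # stale cached result; wait and retry
    raise TimeoutError('no fresh constraint check result')
```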
We can determine that we got a cached result, but I don’t think we can find out if any related entities were edited since then. Do you think adding a disclaimer like this would be okay?
Thanks, then it makes more sense that the change I’m about to submit only touches views :)
I think I just found the right config file in puppet – can I just submit a Gerrit change there?
Oh, I guess 1 is converted to that URI when exporting to RDF. Nevermind then.
This problem isn’t introduced with this change, but shouldn’t the default unit be http://www.wikidata.org/entity/Q199 instead of 1? See T167565: Wikidata allows invalid URIs to be entered as units.
Mon, Jul 17
Done; I opened the separate task T170811: Check constraint parameters after constraint statement is saved for the remaining unmerged change (needs more work).