Also it does not process the "baserevid" parameter and does not return the entity->lastrevid value.
Aug 13 2018
Apr 21 2018
Mar 17 2018
The bot parses the full Wikidata dump (844 GB of XML files) and loads items and their properties into memory. This in-memory data is used for report generation. Wikidata currently has ~47,000,000 items, so my code uses ~144 bytes per item. I cannot load only part of the data, because dump parsing is a long, sequential process (~5 hours, using 4 threads).
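The numbers above can be checked with a little arithmetic (the figures are taken from the post; the variable names are mine):

```python
# Rough check of the memory figures quoted above.
item_count = 47_000_000      # approximate number of Wikidata items (from the post)
bytes_per_item = 144         # reported in-memory footprint per item
total_bytes = item_count * bytes_per_item

# ~6.3 GiB resident to hold every item, which is why partial loading
# would not save the ~5 hours of sequential dump parsing anyway.
print(round(total_bytes / 2**30, 1))  # → 6.3
```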
Mar 14 2018
Aug 24 2017
About KrBot: the current implementation fully ignores references and ranks. Qualifiers are checked only by https://www.wikidata.org/wiki/Q37845003 and https://www.wikidata.org/wiki/Q21510863. Also, the https://www.wikidata.org/wiki/Q19474404 constraint uses qualifiers if https://www.wikidata.org/wiki/Property:P4155 is specified. In other words: qualifiers are largely not checked either.
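The checking policy described above can be sketched roughly as follows. The Q-ids come from the post; the function name, parameter shapes, and return values are hypothetical, not KrBot's actual code:

```python
# Hypothetical sketch: which parts of a statement a constraint check inspects.
# Per the post, references and ranks are always ignored; qualifiers are read
# only by the two listed constraint types, plus Q19474404 when a separator
# property (P4155) is configured for it.

QUALIFIER_AWARE = {"Q37845003", "Q21510863"}

def parts_used(constraint_type, params):
    """Return which statement parts a check of this constraint type reads."""
    parts = {"mainsnak"}                      # the main value is always checked
    if constraint_type in QUALIFIER_AWARE:
        parts.add("qualifiers")
    if constraint_type == "Q19474404" and "P4155" in params:
        parts.add("qualifiers")               # separator property configured
    # "references" and "rank" are never added: the implementation ignores them
    return parts

print(parts_used("Q19474404", {}))                   # → {'mainsnak'}
print("qualifiers" in parts_used("Q21510863", {}))   # → True
```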
Aug 23 2017
All other constraints store their settings as qualifiers on the property page. I think it is good practice to make all constraints as similar as possible; this makes the full constraint system easier to implement and understand.
Aug 22 2017
Most constraints are fully parametrized; the algorithms have no hard-coded properties. The only exception is P31/P279 in the type/value-type constraints. Some users think that P31 and P279 should be parameters too. The second issue is OR aggregation, which makes the algorithm a bit nondeterministic.
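The P31/P279 special case and the OR aggregation can be illustrated with a small sketch. The toy class graph and the function names are mine, not the actual implementation; a real check would walk the subclass hierarchy loaded from the dump:

```python
# Hypothetical sketch of a type constraint: an item conforms if ANY of its
# P31 ("instance of") values reaches ANY of the expected classes by walking
# P279 ("subclass of") upwards -- the OR aggregation mentioned above.

SUBCLASS_OF = {              # toy P279 edges: class -> parent classes
    "Q5": ["Q215627"],       # human -> person
    "Q215627": ["Q35120"],   # person -> entity
}

def reaches(cls, targets):
    """Walk P279 upwards from cls; True if any target class is reached."""
    seen, stack = set(), [cls]
    while stack:
        c = stack.pop()
        if c in targets:
            return True
        if c not in seen:
            seen.add(c)
            stack.extend(SUBCLASS_OF.get(c, []))
    return False

def type_ok(instance_of_values, expected_classes):
    # OR over the item's P31 values: one conforming value is enough
    return any(reaches(v, set(expected_classes)) for v in instance_of_values)

print(type_ok(["Q5"], ["Q215627"]))  # → True: human is a subclass of person
```

Note that P31 and P279 are fixed inside the algorithm here, which is exactly the hard-coding some users object to.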
Jun 12 2017
Format constraint violations can be autofixed too. See
ConstraintMigration is not needed for me; I use different code.
Jun 2 2017
I confirm that the code for the migration is ready to run. I am waiting for the wbcheckconstraints API to be ready.