Fri, Oct 19
T206913 was an appropriate fix :)
This has not failed us in a while, closing as Resolved.
I’ll close this as Resolved since we tackled this with T153746: ErfgoedBot categorisation process should ignore hidden categories.
I actually (tried to) do that on purpose... except for the "on top of other content" part ^__^
It is now grouped by source page. I believe this was done in ac7ab3688b (deployed in July 2018).
This looks to be fixed in https://tools.wmflabs.org/heritage/api/api.php?action=statistics&stcountry=ua&format=html ?
Wed, Oct 17
Tue, Oct 16
Thanks for your guidance @hashar !
Mon, Oct 15
@Aklapper Would you be able to provide guidance on the proper process to either have permission to do this, or request to someone who does?
Sat, Oct 13
Fri, Oct 12
Work in progress output : https://commons.wikimedia.org/wiki/User:Jean-Fr%C3%A9d%C3%A9ric/Monuments_database_Statistics_fr_fr
Tue, Oct 2
Thu, Sep 27
If CommonsDelinker edits are problematic on Wikidata, can’t it just be blocked there until the bugs are fixed?
Mon, Sep 24
Sep 21 2018
Latest report has some 17 empty sources :) https://commons.wikimedia.org/wiki/Commons:Monuments_database/Statistics
Sep 20 2018
Sep 18 2018
Resolved this at the same time as T204427.
Sep 17 2018
The country name is not used consistently across links, pages, etc.
Sep 16 2018
Sep 13 2018
Sep 12 2018
I have been thinking about how to go about that for a while now (it took me quite a while too, as I was confused by all the wrapping that pywikibot does on top of logging...)
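For reference, a minimal sketch of getting underneath those wrappers, assuming pywikibot routes its output through the standard logging module under the "pywiki" logger name (an assumption to verify against your pywikibot version):

```python
import logging

# Attach a handler directly to the logger pywikibot (assumed) uses,
# bypassing its own output()/warning() wrapper layer:
pywiki_logger = logging.getLogger("pywiki")
handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
pywiki_logger.addHandler(handler)
pywiki_logger.setLevel(logging.DEBUG)

# Child loggers such as "pywiki.bot" propagate up to the handler above:
logging.getLogger("pywiki.bot").info("harvest started")
```

Because handlers attached to a parent logger receive records from all its children, this captures everything below "pywiki" without touching the wrapper code.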
FYI, they recently revamped the Phabricator extension for Sentry (cf. https://github.com/getsentry/sentry-plugins/pull/373) − should make it to the next Sentry release (unscheduled yet).
Sep 11 2018
We tried running harvesting just on gb-sct, and it crashed again (so it’s not due to some weird timing linked to full harvest).
We noticed a lot of warnings for missing primary keys.
Oh, of course, this does not make use of the virtual environment >_>
Can’t reproduce locally…
$ docker-compose run --rm bot python erfgoedbot/populate_image_table.py
Starting labs-tools-heritage_db_1 ... done
Working on all countrycodes
Found 96 countries with monument tracker templates to work on
Been thinking about this. If we have diagnosed this properly, the issue is that categorize_images locks the table for its entire run. Indeed the code is:
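Not the actual ErfgoedBot code, but a minimal sqlite3 sketch of the failure mode described (table and column names are illustrative): one long transaction holds the write lock for the entire run, so a second writer blocks until the first commits.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "monuments.db")
writer = sqlite3.connect(path)
writer.execute("CREATE TABLE image (name TEXT)")
writer.commit()

# Take the write lock up front and hold it for the whole "categorisation run":
writer.execute("BEGIN IMMEDIATE")
writer.execute("INSERT INTO image VALUES ('a.jpg')")

# A second job trying to write meanwhile times out with "database is locked":
other = sqlite3.connect(path, timeout=0.1)
try:
    other.execute("INSERT INTO image VALUES ('b.jpg')")
except sqlite3.OperationalError as err:
    print("blocked:", err)

writer.commit()  # only now can other writers get through
```

Committing in smaller batches (or locking per row) would let concurrent jobs interleave instead of stalling for the whole run.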
The workaround solution is now not working, per T202281: no jstop command available in a GridEngine job
I would have hoped the tests would run faster after this, but a time tox -e py27 still yields ~30s :-(
Sep 10 2018
Sep 9 2018
Confirmed in later categorisation job: https://commons.wikimedia.org/wiki/Special:Diff/319422336
Sep 8 2018
$ crontab -l
*/15 * * * * jsub -l release=trusty -mem 1000m -once -j y -o /data/project/wikiloves/logs/update-database.log -N update_database_high /data/project/wikiloves/wikiloves/bin/update_database.sh monuments2018 >> /data/project/wikiloves/logs/crontab.log
Sep 7 2018
OK, something must be off, because it’s fairly unlikely that on https://tools.wmflabs.org/wikiloves/monuments/2017/Germany, 39 images could be uploaded on the first day by 0 joiners :p
Daily data is now exposed at https://tools.wmflabs.org/wikiloves/monuments/2018/France
Sep 6 2018
Yeah, I did not figure out how to integrate ShellCheck, but bashate is good enough for now :) Thanks!