Mon, Jan 11
Appears to be a duplicate of T253053: Make it possible to add references to statements in Structured Data on Wikimedia Commons?
Appears to be a duplicate of T230315: Allow adding references to structured data (MediaInfo) statements?
Dec 16 2020
As of today, December 16th, M8979671 is still present in that query’s results.
Nov 24 2020
Nov 13 2020
Nov 3 2020
Oct 20 2020
Even logged out, only the front page seems accessible; e.g. https://quarry.wmflabs.org/query/runs/all returns a 500 Internal Server Error.
I can confirm this behaviour − noticed it a few hours ago.
Oct 15 2020
Thanks for the answer @Trizek-WMF,
There is at the moment significant friction on Wikimedia Commons: as part of StructuredDataOnCommons, a couple of bots are making millions of edits. As they are marked as bot edits, users can filter them out, but some users do want to audit bot activity on files of interest. Users would thus like to exclude edits from these two bots only.
What is the status of this?
Oct 9 2020
Oct 6 2020
I can confirm this.
Oct 3 2020
Ah ah − turns out the uploader’s username has been oversighted.
Oct 2 2020
Because of T264217, several datasets disappeared and could not be recreated. I paused the general nightly update and triggered updates by hand for a handful of datasets, including monuments2019.
Sep 30 2020
Sep 18 2020
Sep 14 2020
Sep 4 2020
Thanks for the ping :)
The configuration had been helpfully done by @Romaine (Special:Diff/446490283); I had forgotten that I had disabled the general update cron the other week while investigating issues with the Wiki Science Competition data...
Aug 28 2020
Aug 27 2020
(Quick note that this error was mentioned in https://www.wikidata.org/wiki/Wikidata:Project_chat#Countries_vs._sovereign_states)
This seems reasonable; indeed, WLE 2020 Ukraine achieving 100% usage is probably not what is expected :)
Aug 21 2020
Aug 19 2020
Let’s close; these things sometimes happen with new configurations pending a webservice restart.
Aug 15 2020
Unassigning myself, leaving it to @TheDJ :)
Aug 6 2020
Jul 21 2020
Redis caching has been live for a while now; follow-up work can be filed as other tasks.
Jul 15 2020
Thanks @bd808 for the investigation and hints!
Jul 14 2020
- crontab invokes jsub with the run.sh script (jsub -mem 1000m -once -j y -o /data/project/integraality/logs/update.log -N update /data/project/integraality/integraality/bin/run.sh)
- run.sh sources the virtual environment (via /data/project/integraality/www/python//venv/bin/activate)
- run.sh runs the Python script (via python integraality/pages_processor.py)
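For illustration, the steps above could be wrapped as follows — a minimal sketch only, since the actual body of run.sh is not shown in this task (the paths come from the comment above; everything else, including the error handling and the working directory, is an assumption):

```shell
#!/bin/bash
# Hypothetical sketch of the run.sh wrapper described above.
# Fail fast on errors, unset variables, and pipeline failures.
set -euo pipefail

# Activate the tool's virtual environment (path from the comment above)
source /data/project/integraality/www/python/venv/bin/activate

# Run the page processor from the tool's code checkout
cd /data/project/integraality/integraality
python integraality/pages_processor.py
```

With a wrapper like this, the crontab line only needs to hand run.sh to jsub, and all environment setup stays in one place.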
Jun 18 2020
(Just lurking here ^_^)
Jun 4 2020
Tentatively fixed by altering the .lighttpd config; let’s see what users say.
May 21 2020
Tentatively closing as invalid − something must have been fixed somewhere by someone :)
May 19 2020
May 16 2020
I updated https://commons.wikimedia.org/wiki/Commons:Monuments_database/Harvesting a bit − it’s not the best documentation, but it looks fairly up to date. The one really missing part is how to configure Wikidata-based sources.
May 11 2020
@bd808 Good point re: toolinfo.json, I had forgotten about that. Added at https://tools-static.wmflabs.org/wudele/toolinfo.json and it’s now showing up in https://tools.wmflabs.org/hay/directory/#/search/wudele
During the hackathon I emailed back and forth with the original author, Raquel Smith. She kindly agreed to license it under the MIT license \o/
May 10 2020
- Service accessible at https://wudele.toolforge.org/
- Code at https://github.com/JeanFred/wudele-toolforge/
- Some background/recap on https://commonists.wordpress.com/2020/05/09/wudele-a-framadate-instance-in-the-wikimedia-cloud/
May 9 2020
As far as I can tell, this is resolved − please reopen if there are additional issues.
Closing this as resolved, as most of the functionality is there. Follow-up work can be filed as separate tasks.
May 6 2020
Good catch, fixed in 1a7a6c8 and kicked off another update: