Pywikibot, Wikidata, i18n, GLAM stuff
Wed, Oct 17
Tue, Oct 16
What year? This is under the 2018 workboard
Mon, Oct 15
And the table should probably be renamed from gb-sct to gb-sct-lb because we also have scheduled monuments, see T207067
The last part is to check the enwp template against our configuration and do some final tweaks (removing multiple options for ID fields and that kind of thing). I hope to get to that soon.
And https://en.wikipedia.org/w/index.php?title=Wikipedia:WikiProject_Historic_sites/Unused_images_of_listed_buildings_in_Scotland&curid=35867555&diff=864167140&oldid=858922112 is suddenly much more work.
Sun, Oct 14
If this gets stuck on debugging on our side, I could probably ask around for contact details for the people running https://data.pdok.nl/sparql . On https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/Federation_report several other endpoints are also failing. Solving the pdok.nl problem might fix some of the others too.
@Smalyshev what do you think? I haven't run into this myself. My feeling is that case insensitive is probably better, but would that require a lot of work?
Based on the responses this task description needs updating to make clear what the end result should be.
Fixing the lists right now. Example edit: https://en.wikipedia.org/w/index.php?title=List_of_listed_buildings_in_Oyne%2C_Aberdeenshire&type=revision&diff=864025741&oldid=857873175
Thu, Oct 4
Same change as T194346
Sat, Sep 29
New property proposed at https://www.wikidata.org/wiki/Wikidata:Property_proposal/Wikipedia_suggested_article_name
Sun, Sep 23
If all the generators listed on https://www.mediawiki.org/wiki/API:Querypage follow the same logic in Pywikibot as the unconnected pages one, then probably all of them have the same issue. Maybe make a new subclass QueryPageGenerator that wraps around https://www.mediawiki.org/wiki/API:Querypage ?
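To illustrate the idea, here is a minimal sketch of what such a generic wrapper around list=querypage could look like. This is not Pywikibot's actual API; the function name and the injected `fetch` callable are hypothetical, chosen so the continuation logic can be exercised without network access.

```python
# Hypothetical sketch of a generic QueryPage generator wrapping
# https://www.mediawiki.org/wiki/API:Querypage -- not Pywikibot's real API.
# The fetch callable is injected (e.g. a thin wrapper around requests)
# so the continuation handling can be tested offline.

def querypage_generator(qppage, fetch, limit=500):
    """Yield page titles from list=querypage, following API continuation."""
    params = {
        'action': 'query',
        'list': 'querypage',
        'qppage': qppage,      # e.g. 'UnconnectedPages'
        'qplimit': limit,
        'format': 'json',
    }
    while True:
        data = fetch(dict(params))
        for result in data['query']['querypage']['results']:
            yield result['title']
        if 'continue' not in data:
            break
        # Merge the continuation parameters into the next request.
        params.update(data['continue'])
```

In real use `fetch` would be something like `lambda p: requests.get(api_url, params=p).json()`; each special page listed on API:Querypage would then just be a different `qppage` value instead of its own generator class.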
Fri, Sep 21
Thu, Sep 20
Wed, Sep 19
Changed https://www.wikidata.org/wiki/MediaWiki:Collabpad to "CollabPad (but it is a special page)" and that's now all over the place.
Sep 13 2018
Sep 12 2018
Sep 4 2018
I checked several systems and both run 2.7.6. This task is way too soon. Come back in a couple of years. You're going way too fast with this campaign to drop support.
Sep 3 2018
https://gerrit.wikimedia.org/r/#/c/pywikibot/core/+/371659/6/pywikibot/site.py sure looks like it. Good to see it implemented!
Sep 2 2018
Aug 15 2018
Aug 4 2018
@Smalyshev / @debt: I think this is one of those tasks where we have a bit of a misunderstanding about scope (see https://lists.wikimedia.org/pipermail/wikidata/2018-August/012282.html ). Close this one as resolved and make clearly scoped follow-up tasks to untangle this? :-)
Jul 25 2018
I don't think anything else is using it. I would check two weeks of logs to see if anything tried to use it and if not, just kill it.
Jul 22 2018
Yup, had this too. In https://commons.wikimedia.org/w/index.php?title=Campaign%3Apainting-pd-art-self&type=revision&diff=312102316&oldid=312090259 I fixed the license. This change didn't have any effect. I kept getting a broken message, see for example https://commons.wikimedia.org/wiki/File:Jos%C3%A9_Garcia_Ramos_-_El_ni%C3%B1o_del_viol%C3%ADn.jpg where it still uses the old template. Later I did https://commons.wikimedia.org/w/index.php?title=Campaign%3Apainting-pd-art-self&type=revision&diff=312135588&oldid=312102316 and that solved it almost instantly.
Jul 18 2018
Jul 13 2018
Jul 12 2018
Jul 10 2018
I ran into this because I imported data that was tagged "NOR", which is a valid ISO 639-2 language code mapping to the ISO 639-1 code "no", see https://en.wikipedia.org/wiki/Norwegian_language . Norwegian is a valid macrolanguage, see https://en.wikipedia.org/wiki/ISO_639_macrolanguage , and it wouldn't be the first macrolanguage to include: we already have ar (Arabic) and ne (Nepali) as valid language codes.
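The normalisation described above can be sketched like this. The mapping dict is illustrative only, with just the three codes mentioned in this comment; a real implementation would use a full ISO 639-2 to 639-1 lookup table rather than this hand-picked subset.

```python
# Illustrative only: a tiny hand-picked mapping from ISO 639-2 codes to
# their ISO 639-1 equivalents, showing the "NOR" -> "no" case above.
# A real implementation would load a complete lookup table.
ISO_639_2_TO_1 = {
    'nor': 'no',   # Norwegian (macrolanguage)
    'ara': 'ar',   # Arabic (macrolanguage)
    'nep': 'ne',   # Nepali (macrolanguage)
}

def normalize_language_code(code):
    """Lower-case the code and map known three-letter codes to two letters."""
    code = code.lower()
    return ISO_639_2_TO_1.get(code, code)
```

So imported data tagged "NOR" would come out as the language code "no", while codes without a known mapping pass through unchanged.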
Jul 9 2018
Jul 8 2018
Jul 5 2018
Looks like @Dalba introduced it in https://gerrit.wikimedia.org/r/#/c/pywikibot/core/+/440096/
Jul 2 2018
Jun 29 2018
Jun 28 2018
@Xqt : Why are you deprecating perfectly valid generators?
Jun 14 2018
So I stopped operating the bot back in 2015 because the time it would cost to fix it wasn't worth it given the (negative) feedback.
If there are multiple people who are willing to help out here I'm more than happy to invest a bit of time to set the thing up again.
Jun 10 2018
zo jun 10 11:44:09 UTC 2018
tools.heritage@tools-bastion-02:~/logs$ ls -alt update_monuments.log
-rw-rw---- 1 tools.heritage tools.heritage 96412359 jun 10 11:44 update_monuments.log
tools.heritage@tools-bastion-02:~/logs$ grep mysqldump update_monuments.log
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_am_(hy) at row: 13534
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_be-vlg_(fr) at row: 19365
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_be-vlg_(fr) at row: 56932
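For tracking down which dumps keep failing, the grep above can be sharpened into a small parser. This is just a sketch against the exact "Error 2013" line format shown in the log; the function name is made up for illustration.

```python
import re

# Sketch: pull the failing table and row out of mysqldump "Error 2013"
# lines like the ones grepped from update_monuments.log above.
ERROR_RE = re.compile(
    r'mysqldump: Error 2013: Lost connection to MySQL server during query '
    r'when dumping table (?P<table>\S+) at row: (?P<row>\d+)')

def failing_tables(log_lines):
    """Return (table, row) pairs for every Error 2013 line in the log."""
    return [(m.group('table'), int(m.group('row')))
            for line in log_lines
            if (m := ERROR_RE.search(line))]
```

Run over the log this would show which tables (here monuments_am_(hy) and monuments_be-vlg_(fr)) are repeatedly losing the connection, and at which row.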
Jun 1 2018
May 30 2018
May 28 2018
Freenode implemented the I-line. I didn't hear anyone complain about IRC, so I guess it worked.
@Smalyshev https://www.wikidata.org/w/index.php?search=%22SK-C-5%22 doesn't work yet, but this task has been closed. Can you explain? This is listed in the task description as something that should work.
May 24 2018
May 20 2018
This is part of https://www.wikidata.org/wiki/Help_talk:Property_constraints_portal#Improvements_for_2018 . Would start with just binary in the first version and maybe do a second version that looks for real references (and not imported from).
May 18 2018
May 17 2018
Confirmed on both Wikidata and the Dutch Wikipedia using Firefox.
May 16 2018
May 15 2018
I emailed Freenode. We should poke them if we haven't gotten confirmation by Thursday.
Note: Because we were a bit late with deploying this change, it didn't actually work and we hit the limit. @Reedy what was the limit again and where is this documented? I would like to update https://meta.wikimedia.org/wiki/Mass_account_creation#Requesting_temporary_lift_of_IP_cap to prevent other people from running into the same problem.
@Urbanecm, with this kind of thing I usually go a bit broader. Feel free to enter what you think is a good number, but I would just do 250. I updated the task with the IP range based on the IP addresses I got from @jcreus.
May 11 2018
I cloned Pywikibot using $ git clone --recursive ssh://email@example.com:29418/pywikibot/core.git pywikibot