Try my attempt here: https://en.m.wikipedia.org/w/index.php?title=Wikipedia:Sandbox&oldid=884468638
BTW, another complaint appeared today on the cswiki technical village pump: https://cs.wikipedia.org/wiki/Wikipedie:Pod_l%C3%ADpou_(technika)#Nadpisy_infobox%C5%AF_v_mobiln%C3%AD_verzi
At least the infobox caption should be styled the same way as the infobox title. Whether to hide it or style it differently is a matter for a follow-up discussion.
@bd808 https://wikitech.wikimedia.org/wiki/Help:Toolforge/Pywikibot is the best of all the Toolforge manuals. It is just missing some information about how to create a tool, and maybe a little more detail about how to run a webservice (though that is out of the page's scope).
@bd808 Well, from my point of view this issue runs a bit deeper. To run Gerrit + git-review, there is a nice tutorial with easy steps. On the other hand, to gain access to Toolforge, connect to it via ssh in a terminal or an ssh-capable file browser, run any kind of command there (Python commands specifically), create a venv, maintain a crontab: all of this I had to learn from my colleague, as no tutorial on Wikitech was clear about what to do. I've already asked about this somewhere (I cannot find where).
Okay, there is currently nothing we can do about it except wait until the Debian developers update the package.
v0.7.11? Soooo close!
Well, we can use a venv and run the script again with an updated pip package. But I have never used a venv on Toolforge, I don't know how it works, and the Toolforge tutorials are not really well written.
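For what it's worth, the Python standard library can create a venv directly; this is just a minimal offline sketch (on Toolforge the usual route would be the `python3 -m venv` shell command, and the target path below is made up):

```python
import os
import tempfile
import venv

# Hypothetical target directory; on Toolforge it would live under the tool's home.
target = os.path.join(tempfile.mkdtemp(), 'pwb-venv')

# Create the isolated environment (with_pip=False keeps this sketch offline;
# a real setup would use with_pip=True and then pip-install the newer pymysql).
venv.create(target, with_pip=False)

print(os.path.exists(os.path.join(target, 'bin', 'python')))
```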
You proposed to split site.py into a folder in https://gerrit.wikimedia.org/r/#/c/pywikibot/core/+/480503/. That could be a possible solution.
tools.dvorapabot@tools-sgebastion-07:~$ python3
Python 3.5.3 (default, Sep 27 2018, 17:25:39)
[GCC 6.3.0 20170516] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pymysql
>>> pymysql.__version__
'0.7.10.None'
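The odd '0.7.10.None' string still compares as older than the needed 0.7.11 once the numeric parts are parsed out; a minimal sketch (`parse_version` is a made-up helper for illustration, not pymysql's API):

```python
def parse_version(version_string):
    # Keep only the numeric dotted components, dropping junk like 'None'.
    return tuple(int(p) for p in version_string.split('.') if p.isdigit())

installed = parse_version('0.7.10.None')   # (0, 7, 10)
required = parse_version('0.7.11')         # (0, 7, 11)
print(installed < required)                # tuple comparison: True
```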
I tried the script multiple times, with the same result every time. If I limit the MySQL query (LIMIT 100, LIMIT 1000), it does not happen; only if I run the whole thing (422,000 articles), where it takes more than 30 hours to complete the task for every article on cswiki. I switched that simple query to site.allpages(), but we should definitely try to fix this for more complicated queries.
Stretch, using crontab (I think jsub runs on the grid engine?)
Have you tried logging into the English Wikipedia first?
I made Link, Page, and also a callback function behave exactly the same; only a string gets the data from the old link. I also made the docs 100% clear about this. I will slice the large patch into small portions.
Meh, such a small issue, and yet we cannot use IABot for Arkive links.
Wed, Feb 20
Yeah, I miss this possibility too!
It seems like another solution for T124016, but good work at first glance.
This is normal, but stupid. For a Link it gets the data from the Link; for a Page and a string it gets the data from the old link text. Weird, but fully described in the docs. I will try to find a better way in my patch.
Tue, Feb 19
This is similar to T104805, and it probably has the same origin: replace_links() working really badly.
Mon, Feb 18
Any thoughts on this?
Wed, Feb 13
We could make a plan Pywikibot users could rely on, like dropping 2.7.6 in 2 months, 2.7.9 in 4 months, leaving some space for another potential deprecation, then 2.7 in 8 months and 3.4 in 10 months (just an idea, not an actual proposal).
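Whatever schedule is eventually chosen, the runtime side of such a drop is just a version gate; a hedged sketch (the floors below match the first steps of the idea above, not any agreed-upon values, and `version_supported` is a made-up helper):

```python
# Hypothetical floors after the first proposed drops: 2.7.6 gone, 2.7.9 still in,
# and 3.4 as the oldest supported Python 3.
MINIMUM_PY2 = (2, 7, 9)
MINIMUM_PY3 = (3, 4)

def version_supported(info):
    """Check an interpreter's version tuple against the hypothetical floors."""
    if info[0] >= 3:
        return info[:2] >= MINIMUM_PY3
    return info[:3] >= MINIMUM_PY2

print(version_supported((2, 7, 6)))   # dropped in step one: False
print(version_supported((3, 5, 3)))   # still fine: True
```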
Tue, Feb 12
Mon, Feb 11
Cosmetic-changes should also fail silently instead of interrupting the edit and freezing scripts like replace.py, add_text.py, etc.
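A hedged sketch of what "fail silently" could look like: wrap the cosmetic-changes pass so any exception is logged and the original text is returned unchanged, instead of aborting the edit (`apply_cosmetic_changes` is a made-up wrapper for illustration, not Pywikibot's actual hook):

```python
def apply_cosmetic_changes(text, transform):
    """Run a cosmetic-changes pass; on any failure, keep the original text."""
    try:
        return transform(text)
    except Exception as exc:  # deliberately broad: never break the edit itself
        print('cosmetic changes skipped: %s' % exc)
        return text

# A transform that blows up leaves the text untouched instead of freezing the script.
print(apply_cosmetic_changes('[[Some link]]', lambda t: 1 / 0))
```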
There is still one broken, unhandled rare case.
It also breaks on:
== Heading ==<!--== Another heading == -->
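One plausible fix direction: strip HTML comments before scanning for headings, so a commented-out heading is ignored. A minimal sketch with a toy regex (Pywikibot has its own comment-aware parsing helpers; this just illustrates the idea):

```python
import re

text = '== Heading ==<!--== Another heading == -->'

# A naive scan also picks up the heading inside the HTML comment.
naive = re.findall(r'==\s*(.*?)\s*==', text)

# Removing comments first leaves only the real heading.
stripped = re.sub(r'<!--.*?-->', '', text, flags=re.DOTALL)
real = re.findall(r'==\s*(.*?)\s*==', stripped)

print(naive)  # ['Heading', 'Another heading']
print(real)   # ['Heading']
```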
Every second article fails to save for me today with cosmetic changes on. My head will explode in a minute! :D
BTW, I don't know if you know, but multiple Toolforge users had all their cron jobs, grid jobs, and webservices on Trusty cancelled on 7 February with no warning. So kmlexport and mapycz were down for about 24 hours (and T215704 didn't help me), and multiple cswiki maintenance robots have been down since then too, also without any fix from your webservice monitoring. That's why I decided to move my tools to Stretch directly the day after: everything was down anyway, so why wait.
BTW, is there a difference between 01 and 02? (I have always wondered.)
Look into /data/project/mapycz/restart.sh and /data/project/mapycz/webwatcher.py (and the same for /data/project/kmlexport/*).
Sun, Feb 10
@bd808 Nope, they are not the same: jlocal works for me, it's just jsub and jstart that do not.
Sat, Feb 9
@Euku Until the patch is merged, you can work around it with a temporary wikimediachapter family with the wikimania language (pwb.py script -family:"wikimediachapter" -lang:"wikimania").
Fri, Feb 8
Tue, Feb 5
It seems Pywikibot cannot handle site redirects well (be-x-old > be-tarask; cs:voy > incubator).
Both this and T113461 are high priority, because the interwiki link exists and is fully functional on Wikipedia; Pywikibot just does not accept it:
pywikibot.exceptions.SiteDefinitionError: voy:Main page is not a local page on wikipedia:cs, and the interwiki prefix voy is not supported by Pywikibot! Unknown URL 'https://cs.wikivoyage.org/wiki/$1'.
pywikibot.exceptions.SiteDefinitionError: be-tarask:Баўгарыя is not a local page on wikipedia:cs, and the interwiki prefix be-tarask is not supported by Pywikibot! Unknown URL 'https://be-x-old.wikipedia.org/wiki/$1'.
A similar error is also thrown for the be-x-old/be-tarask Wikipedia: T113461
I think this should either fail on p = pywikibot.Page() or never fail at all. It should definitely not fail on p.exists() or p.isRedirectPage().
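One possible direction: normalize renamed language codes up front, before the interwiki lookup fails. A hedged sketch (`PREFIX_ALIASES` and `resolve_prefix` are made up for illustration, not Pywikibot's API):

```python
# New interwiki prefixes mapped to the site codes still used internally,
# e.g. the be-tarask prefix serving content from be-x-old.wikipedia.org.
PREFIX_ALIASES = {
    'be-tarask': 'be-x-old',
}

def resolve_prefix(prefix):
    """Map a renamed interwiki prefix to the known site code, else pass it through."""
    return PREFIX_ALIASES.get(prefix, prefix)

print(resolve_prefix('be-tarask'))  # 'be-x-old'
print(resolve_prefix('cs'))         # unchanged: 'cs'
```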