As a rPWBC scripts/category_redirect.py user I noticed that this script is apparently not {{bots|deny=all}}/{{nobots}} compliant. The only thing I could do to prevent the bot from editing a specific page was to fully protect it (see example).
Description
Details
Subject | Repo | Branch | Lines +/-
---|---|---|---
[bugfix] check for {{bots}}/{{nobots}} for original wiki text first | pywikibot/core | master | +49 -27
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | Release | Xqt | T269823 Pywikibot release 5.2.0 dependencies
Resolved | BUG REPORT | Xqt | T267770 wikibase_tests fails for TestLoadRevisionsCaching.test_page_text
Resolved | BUG REPORT | Xqt | T262136 {{bots}}/{{nobots}} fails if the templates are overridden with new page text
Event Timeline
Change 639893 had a related patch set uploaded (by Xqt; owner: Xqt):
[pywikibot/core@master] [bugfix] Make category_redirect.py {{bots}}/{{nobots}} compliant
This is a bigger problem. Page.save checks for botMayEdit, but unfortunately it does so on the already-changed text:
```
@@ -1,2 +1 @@
+ {{Categoría redirigida|Wikilibros:Políticas y orientaciones}}
- #redirect [[:Categoría:Wikilibros:Políticas y orientaciones]]
- {{bots|deny=all}}
```
As you can see, the {{bots}} restriction was removed by the edit itself and is therefore no longer taken into account. The given patch solves the current problem, but there should probably be a more general solution.
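The failure mode above can be sketched as follows. This is a minimal illustration, not pywikibot's actual implementation: the function names (`bot_may_edit`, `save_page`) and the simplified regex are assumptions made for the example. The point is that the exclusion check must run against the original wikitext, because the proposed edit may itself remove the {{bots}}/{{nobots}} template.

```python
import re

# Simplified bot-exclusion pattern (illustrative only): matches
# {{nobots}} and {{bots|deny=all}}, which forbid edits by any bot.
DENY_ALL = re.compile(
    r'\{\{\s*(nobots|bots\s*\|\s*deny\s*=\s*all)\s*\}\}',
    re.IGNORECASE)

def bot_may_edit(wikitext: str) -> bool:
    """Return False if the given page text opts out of bot edits."""
    return DENY_ALL.search(wikitext) is None

def save_page(original_text: str, new_text: str) -> str:
    # The bug: checking new_text lets an edit that strips the template
    # slip through. The fix is to check original_text first.
    if not bot_may_edit(original_text):
        raise PermissionError('page excludes bots; skipping edit')
    return new_text
```

With this ordering, an edit whose diff removes `{{bots|deny=all}}` (as in the diff above) is rejected, because the template was still present in the original revision.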
Change 639893 merged by jenkins-bot:
[pywikibot/core@master] [bugfix] check for {{bots}}/{{nobots}} for original wiki text first