Fetching template transclusions...
Processing [[ve:Wikipedia:Community Portal]]
125 Threads found on [[ve:Wikipedia:Community Portal]]
Looking for: {{User:MABot/config}} in [[ve:Wikipedia:Community Portal]]
Processing 125 threads
Archiving 108 thread(s).
Page [[Wikipedia:Community Portal/2010]] saved
Page [[Wikipedia:Community Portal/2011]] saved
Page [[Wikipedia:Community Portal/2012]] saved
Page [[Wikipedia:Community Portal/2013]] saved
Page [[Wikipedia:Community Portal/2014]] saved
Page [[Wikipedia:Community Portal/2015]] saved
WARNING: API error mwoauth-invalid-authorization-invalid-user: The authorization headers in your request are for a user that does not exist here
ERROR: Error occurred while processing page [[ve:Wikipedia:Community Portal]]
ERROR: NoUsername: Failed OAuth authentication for wikivoyage:en: The authorization headers in your request are for a user that does not exist here
Traceback (most recent call last):
  File ".\scripts\archivebot.py", line 741, in main
    archiver.run()
  File ".\scripts\archivebot.py", line 632, in run
    self.archives[a].update(comment)
  File ".\scripts\archivebot.py", line 458, in update
    self.save(summary)
  File "F:\MWDEV\pywikibot-core\pywikibot\tools\__init__.py", line 1447, in wrapper
    return obj(*__args, **__kw)
  File "F:\MWDEV\pywikibot-core\pywikibot\page.py", line 1209, in save
    cc=apply_cosmetic_changes, quiet=quiet, **kwargs)
  File "F:\MWDEV\pywikibot-core\pywikibot\page.py", line 1217, in _save
    summary = self._cosmetic_changes_hook(summary) or summary
  File "F:\MWDEV\pywikibot-core\pywikibot\page.py", line 1267, in _cosmetic_changes_hook
    self.text = ccToolkit.change(old)
  File "F:\MWDEV\pywikibot-core\pywikibot\cosmetic_changes.py", line 271, in change
    new_text = self._change(text)
  File "F:\MWDEV\pywikibot-core\pywikibot\cosmetic_changes.py", line 265, in _change
    text = self.safe_execute(method, text)
  File "F:\MWDEV\pywikibot-core\pywikibot\cosmetic_changes.py", line 252, in safe_execute
    result = method(text)
  File "F:\MWDEV\pywikibot-core\pywikibot\cosmetic_changes.py", line 567, in cleanUpLinks
    'startspace'])
  File "F:\MWDEV\pywikibot-core\pywikibot\textlib.py", line 376, in replaceExcept
    replacement = new(match)
  File "F:\MWDEV\pywikibot-core\pywikibot\cosmetic_changes.py", line 453, in handleOneLink
    if not self.site.isInterwikiLink(titleWithSection):
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 1135, in isInterwikiLink
    linkfam, linkcode = pywikibot.Link(text, self).parse_site()
  File "F:\MWDEV\pywikibot-core\pywikibot\page.py", line 5144, in parse_site
    newsite = self._source.interwiki(prefix)
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 949, in interwiki
    return self._interwikimap[prefix].site
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 705, in __getitem__
    raise self._iw_sites[prefix].site
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 668, in site
    self._site = pywikibot.Site(url=self.url)
  File "F:\MWDEV\pywikibot-core\pywikibot\__init__.py", line 799, in Site
    code = family.from_url(url)
  File "F:\MWDEV\pywikibot-core\pywikibot\family.py", line 1217, in from_url
    if path in site._interwiki_urls():
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 939, in _interwiki_urls
    yield self.article_path
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 2254, in article_path
    assert self.siteinfo['general']['articlepath'].endswith('/$1'), \
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 1634, in __getitem__
    return self.get(key, False)  # caches and doesn't force it
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 1676, in get
    preloaded = self._get_general(key, expiry)
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 1622, in _get_general
    default_info = self._get_siteinfo(props, expiry)
  File "F:\MWDEV\pywikibot-core\pywikibot\site.py", line 1548, in _get_siteinfo
    data = request.submit()
  File "F:\MWDEV\pywikibot-core\pywikibot\data\api.py", line 2342, in submit
    self._data = super(CachedRequest, self).submit()
  File "F:\MWDEV\pywikibot-core\pywikibot\data\api.py", line 2175, in submit
    % (self.site, info))
pywikibot.exceptions.NoUsername: Failed OAuth authentication for wikivoyage:en: The authorization headers in your request are for a user that does not exist
I'm not sure why the bot has to resolve the interwiki links on a page; in any case, it shouldn't, since that isn't needed for archiving. As it stands, if an archived page contains an interwiki link pointing to a project where the bot is not registered, archivebot fails.
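As the traceback shows, the interwiki resolution is triggered by the cosmetic-changes hook (`_cosmetic_changes_hook` → `cleanUpLinks` → `isInterwikiLink`), not by the archiving logic itself. A possible workaround, assuming the standard Pywikibot configuration mechanism, is to switch off cosmetic changes in `user-config.py` so `page.save()` never enters that code path:

```python
# user-config.py (Pywikibot configuration fragment, not a standalone script)

# Workaround sketch: disable the cosmetic-changes hook entirely.
# With this off, Page.save() skips cleanUpLinks(), which is the step
# that resolves interwiki prefixes and attempts the OAuth-authenticated
# siteinfo request against the foreign wiki (wikivoyage:en here).
cosmetic_changes = False
```

This only sidesteps the crash at the cost of losing cosmetic fixes everywhere; a proper fix would be for archivebot (or `isInterwikiLink`) to tolerate unreachable interwiki targets instead of letting `NoUsername` propagate.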