
Provide a warning message instead of a huge traceback for timeout error
Open, Medium, Public

Description

Using delete.py, I got this huge traceback when the bot failed to delete a page due to a timeout error:

>>> Page for deletion <<<
WARNING: There are 9 pages that link to [[de:Page for deletion]].
    Main:    [0]          7 pages
    Talk:    [1]          1 page
    User:    [2]          1 page
Sleeping for 3.5 seconds, 2018-12-15 15:43:52
ERROR: Traceback (most recent call last):
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 384, in _make_request
    six.raise_from(e, None)
  File "<string>", line 2, in raise_from
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 380, in _make_request
    httplib_response = conn.getresponse()
  File "C:\python37\lib\http\client.py", line 1321, in getresponse
    response.begin()
  File "C:\python37\lib\http\client.py", line 296, in begin
    version, status, reason = self._read_status()
  File "C:\python37\lib\http\client.py", line 257, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "C:\python37\lib\socket.py", line 589, in readinto
    return self._sock.recv_into(b)
  File "C:\python37\lib\ssl.py", line 1049, in recv_into
    return self.read(nbytes, buffer)
  File "C:\python37\lib\ssl.py", line 908, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\python37\lib\site-packages\requests\adapters.py", line 445, in send
    timeout=timeout
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "C:\python37\lib\site-packages\urllib3\util\retry.py", line 367, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "C:\python37\lib\site-packages\urllib3\packages\six.py", line 686, in reraise
    raise value
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 386, in _make_request
    self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
  File "C:\python37\lib\site-packages\urllib3\connectionpool.py", line 306, in _raise_timeout
    raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='de.wikipedia.org', port=443): Read timed out. (read timeout=45)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\pwb\GIT\core\pywikibot\data\api.py", line 1935, in _http_request
    body=body, headers=headers)
  File "C:\pwb\GIT\core\pywikibot\tools\__init__.py", line 1739, in wrapper
    return obj(*__args, **__kw)
  File "C:\pwb\GIT\core\pywikibot\comms\http.py", line 327, in request
    r = fetch(baseuri, method, params, body, headers, **kwargs)
  File "C:\pwb\GIT\core\pywikibot\comms\http.py", line 530, in fetch
    error_handling_callback(request)
  File "C:\pwb\GIT\core\pywikibot\comms\http.py", line 415, in error_handling_callback
    raise request.data
  File "C:\pwb\GIT\core\pywikibot\comms\http.py", line 394, in _http_process
    **http_request.kwargs)
  File "C:\python37\lib\site-packages\requests\sessions.py", line 512, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\python37\lib\site-packages\requests\sessions.py", line 622, in send
    r = adapter.send(request, **kwargs)
  File "C:\python37\lib\site-packages\requests\adapters.py", line 526, in send
    raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='de.wikipedia.org', port=443): Read timed out. (read timeout=45)

WARNING: Waiting 5 seconds before retrying.
WARNING: API error missingtitle: The page you specified doesn't exist.

92 pages read
0 pages written
Execution time: 548 seconds
Read operation time: 5 seconds
Script terminated by exception:

ERROR: APIError: missingtitle: The page you specified doesn't exist. [help:See https://de.wikipedia.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce> for notice of API deprecations and breaking changes.]
Traceback (most recent call last):
  File "C:\pwb\GIT\core\pwb.py", line 257, in <module>
    if not main():
  File "C:\pwb\GIT\core\pwb.py", line 250, in main
    run_python_file(filename, [filename] + args, argvu, file_package)
  File "C:\pwb\GIT\core\pwb.py", line 119, in run_python_file
    main_mod.__dict__)
  File ".\scripts\delete.py", line 308, in <module>
    main()
  File ".\scripts\delete.py", line 300, in main
    bot.run()
  File "C:\pwb\GIT\core\pywikibot\bot.py", line 1702, in run
    super(MultipleSitesBot, self).run()
  File "C:\pwb\GIT\core\pywikibot\bot.py", line 1505, in run
    self.treat(page)
  File "C:\pwb\GIT\core\pywikibot\bot.py", line 1732, in treat
    self.treat_page()
  File ".\scripts\delete.py", line 220, in treat_page
    quit=True)
  File "C:\pwb\GIT\core\pywikibot\tools\__init__.py", line 1739, in wrapper
    return obj(*__args, **__kw)
  File "C:\pwb\GIT\core\pywikibot\page.py", line 1914, in delete
    return self.site.deletepage(self, reason)
  File "C:\pwb\GIT\core\pywikibot\site.py", line 1323, in callee
    return fn(self, *args, **kwargs)
  File "C:\pwb\GIT\core\pywikibot\tools\__init__.py", line 1739, in wrapper
    return obj(*__args, **__kw)
  File "C:\pwb\GIT\core\pywikibot\site.py", line 5627, in deletepage
    req.submit()
  File "C:\pwb\GIT\core\pywikibot\data\api.py", line 2270, in submit
    raise APIError(**result['error'])
pywikibot.data.api.APIError: missingtitle: The page you specified doesn't exist.
 [help:See https://de.wikipedia.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce> for notice of API deprecations and breaking changes.]
<class 'pywikibot.data.api.APIError'>
CRITICAL: Closing network session.

C:\pwb\GIT\core>

It seems the last error appeared because the bot retried the delete request after the timeout, by which point the page had already been deleted, but I am not sure.
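
The requested behaviour could look something like the sketch below: catch the timeout where the request is dispatched and condense it into a single warning line instead of re-raising the three chained tracebacks. This is only an illustration, not pywikibot code; fetch_with_warning and flaky_fetch are hypothetical names, and socket.timeout is used because it is the root exception in the log above.

```python
import socket


def fetch_with_warning(fetch, *args, **kwargs):
    """Call *fetch*; on a read timeout, print one warning and return None.

    Hypothetical wrapper illustrating the requested behaviour: the caller
    gets a short, readable message instead of a chained traceback.
    """
    try:
        return fetch(*args, **kwargs)
    except socket.timeout as exc:
        # One concise line replaces the full exception chain.
        print('WARNING: Read operation timed out: {}'.format(exc))
        return None


def flaky_fetch():
    """Stand-in for an HTTP fetch that always times out."""
    raise socket.timeout('The read operation timed out')
```

A caller could then check for None and decide whether to retry or skip the page, rather than crashing with the traceback shown above.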