
A lot of test fails due to UnicodeDecodeError with Python 2 at Appveyor and Travis
Closed, ResolvedPublic

Description

https://ci.appveyor.com/project/Ladsgroup/pywikibot-g4xqx/builds/30354124/job/4oxpjt9jkrw2di7t

All of them end with:

    result = api.encode_url(query)
  File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
    return urlencode(query)
  File "c:\python27-x64\Lib\urllib.py", line 1344, in urlencode
    v = quote_plus(str(v))
  File "c:\python27-x64\Lib\urllib.py", line 1307, in quote_plus
    return quote(s, safe)
  File "c:\python27-x64\Lib\urllib.py", line 1298, in quote
    if not s.rstrip(safe):
UnicodeDecodeError: 'ascii' codec can't decode byte 0xd1 in position 0: ordinal not in range(128)
======================================================================
ERROR: test_arabeyes (tests.site_detect_tests.NonStandardVersionSiteTestCase)
Test detection of MediaWiki sites for wiki.arabeyes.org.
----------------------------------------------------------------------

urllib seems to cause these problems
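The failure mode can be reproduced in spirit on Python 3, where mixing a byte string (Python 2's str) with a unicode argument is an explicit error rather than an implicit ASCII decode. A sketch of the mismatch, not pywikibot code:

```python
# Python 2's str is a byte string; encode a non-ASCII title the way
# the urlencode() call in the traceback would see it.
encoded = 'Ñandú'.encode('utf-8')

# Python 2's urllib.quote() ends with `if not s.rstrip(safe):`. When
# `safe` is a unicode string, Python 2 implicitly decodes the bytes as
# ASCII and raises UnicodeDecodeError on byte 0xc3; Python 3 makes the
# same type mismatch explicit instead.
try:
    encoded.rstrip('/')
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```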


Event Timeline

Restricted Application added subscribers: pywikibot-bugs-list, Aklapper.
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Downloading urllib3-1.25.8-py2.py3-none-any.whl (125 kB)

urllib3-1.25.8

Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in c:\projects\pywikibot-g4xqx\env\lib\site-packages (from requests>=2.20.1->-r requirements.txt (line 23)) (1.25.8)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in c:\projects\pywikibot-g4xqx\env\lib\site-packages (from requests>=2.20.1->pywikibot==3.0.20200125.dev0) (1.25.8)

The new urllib3 1.25.8 was published on 21 January 2020

Change 567326 had a related patch set uploaded (by Xqt; owner: Xqt):
[pywikibot/core@master] [bugfix] prevent installing urllib3 == 1.25.8

https://gerrit.wikimedia.org/r/567326
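Such a pin would presumably land in requirements.txt; one way to express a Python-2-only exclusion is a PEP 508 environment marker. This is a sketch of the idea, not the actual patch:

```
requests>=2.20.1
urllib3>=1.21.1,!=1.25.0,!=1.25.1,!=1.25.8,<1.26 ; python_version < "3.0"
urllib3>=1.21.1,!=1.25.0,!=1.25.1,<1.26 ; python_version >= "3.0"
```

The marker keeps 1.25.8 installable on Python 3 while excluding it only where the failure was observed.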

Xqt renamed this task from A lot of test fails due to UnicodeDecodeError with Python2 to A lot of test fails due to UnicodeDecodeError with Python2 at Appveyor. Jan 26 2020, 1:20 PM

Change 567326 merged by jenkins-bot:
[pywikibot/core@master] [bugfix] prevent installing urllib3 == 1.25.8 with Python 2

https://gerrit.wikimedia.org/r/567326

This does not seem to be an issue with urllib3==1.25.8: in https://ci.appveyor.com/project/Ladsgroup/pywikibot-g4xqx/builds/30350615/job/fopgq04i4517x7a0?fullLog=true the tests weren't failing even with 1.25.8 installed

It's urllib rather than urllib3:

[00:22:05] ======================================================================
[00:22:05] ERROR: test_url_encoding_from_basestring (tests.api_tests.TestUrlEncoding)
[00:22:05] Test encoding basestring values.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\api_tests.py", line 1120, in test_url_encoding_from_basestring
[00:22:05]     result = api.encode_url(query)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 4: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_url_encoding_from_unicode (tests.api_tests.TestUrlEncoding)
[00:22:05] Test encoding unicode values.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\api_tests.py", line 1109, in test_url_encoding_from_unicode
[00:22:05]     result = api.encode_url(query)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xd1 in position 0: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_allpages_default (tests.pagegenerators_tests.TestFactoryGenerator)
[00:22:05] Test allpages generator.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\pagegenerators_tests.py", line 867, in test_allpages_default
[00:22:05]     pages = set(gen)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 2: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_regexfilter_default (tests.pagegenerators_tests.TestFactoryGenerator)
[00:22:05] Test allpages generator with titleregex filter.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\pagegenerators_tests.py", line 894, in test_regexfilter_default
[00:22:05]     pages = list(gen)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\pagegenerators.py", line 1951, in titlefilter
[00:22:05]     for page in generator:
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 2: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_regexfilternot_default (tests.pagegenerators_tests.TestFactoryGenerator)
[00:22:05] Test allpages generator with titleregexnot filter.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\pagegenerators_tests.py", line 934, in test_regexfilternot_default
[00:22:05]     pages = list(gen)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\pagegenerators.py", line 1951, in titlefilter
[00:22:05]     for page in generator:
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 2: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: testFileTitle (tests.page_tests.TestPageObjectEnglish)
[00:22:05] Test title() method options in File namespace.
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\page_tests.py", line 251, in testFileTitle
[00:22:05]     self.assertEqual(p2.title(as_url=True),
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 373, in title
[00:22:05]     title = quote_from_bytes(encoded_title, safe='')
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 11: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_archivebot_hu (tests.archivebot_tests.TestArchiveBot)
[00:22:05] Test archivebot for one site on wikipedia:hu
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\aspects.py", line 748, in wrapped_method
[00:22:05]     func(self, key)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\archivebot_tests.py", line 138, in test_archivebot
[00:22:05]     talk = archivebot.DiscussionPage(page, None)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 427, in __init__
[00:22:05]     self.load_page()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 445, in load_page
[00:22:05]     text = self.get()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1840, in wrapper
[00:22:05]     return obj(*new_args, **new_kwargs)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 488, in get
[00:22:05]     self._getInternals()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 518, in _getInternals
[00:22:05]     self.site.loadrevisions(self, content=True)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\site.py", line 4119, in loadrevisions
[00:22:05]     for pagedata in rvgen:
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2983, in __iter__
[00:22:05]     for result in super(PropertyGenerator, self).__iter__():
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2234, in submit
[00:22:05]     self._data = super(CachedRequest, self).submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc5 in position 9: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_archivebot_ja (tests.archivebot_tests.TestArchiveBot)
[00:22:05] Test archivebot for one site on wikipedia:ja
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\aspects.py", line 748, in wrapped_method
[00:22:05]     func(self, key)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\archivebot_tests.py", line 138, in test_archivebot
[00:22:05]     talk = archivebot.DiscussionPage(page, None)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 427, in __init__
[00:22:05]     self.load_page()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 445, in load_page
[00:22:05]     text = self.get()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1840, in wrapper
[00:22:05]     return obj(*new_args, **new_kwargs)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 488, in get
[00:22:05]     self._getInternals()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 518, in _getInternals
[00:22:05]     self.site.loadrevisions(self, content=True)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\site.py", line 4119, in loadrevisions
[00:22:05]     for pagedata in rvgen:
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2983, in __iter__
[00:22:05]     for result in super(PropertyGenerator, self).__iter__():
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2234, in submit
[00:22:05]     self._data = super(CachedRequest, self).submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xe5 in position 0: ordinal not in range(128)
[00:22:05] 
[00:22:05] ======================================================================
[00:22:05] ERROR: test_archivebot_sv (tests.archivebot_tests.TestArchiveBot)
[00:22:05] Test archivebot for one site on wikipedia:sv
[00:22:05] ----------------------------------------------------------------------
[00:22:05] Traceback (most recent call last):
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\aspects.py", line 748, in wrapped_method
[00:22:05]     func(self, key)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\tests\archivebot_tests.py", line 138, in test_archivebot
[00:22:05]     talk = archivebot.DiscussionPage(page, None)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 427, in __init__
[00:22:05]     self.load_page()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\scripts\archivebot.py", line 445, in load_page
[00:22:05]     text = self.get()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1840, in wrapper
[00:22:05]     return obj(*new_args, **new_kwargs)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 488, in get
[00:22:05]     self._getInternals()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\page.py", line 518, in _getInternals
[00:22:05]     self.site.loadrevisions(self, content=True)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\tools\__init__.py", line 1757, in wrapper
[00:22:05]     return obj(*__args, **__kw)
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\site.py", line 4119, in loadrevisions
[00:22:05]     for pagedata in rvgen:
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2983, in __iter__
[00:22:05]     for result in super(PropertyGenerator, self).__iter__():
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2807, in __iter__
[00:22:05]     self.data = self.request.submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 2234, in submit
[00:22:05]     self._data = super(CachedRequest, self).submit()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1970, in submit
[00:22:05]     paramstring = self._http_param_string()
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 1586, in _http_param_string
[00:22:05]     return encode_url(self._encoded_items())
[00:22:05]   File "c:\projects\pywikibot-g4xqx\pywikibot\data\api.py", line 3190, in encode_url
[00:22:05]     return urlencode(query)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1344, in urlencode
[00:22:05]     v = quote_plus(str(v))
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1307, in quote_plus
[00:22:05]     return quote(s, safe)
[00:22:05]   File "c:\python27\Lib\urllib.py", line 1298, in quote
[00:22:05]     if not s.rstrip(safe):
[00:22:05] UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 3: ordinal not in range(128)
[00:22:05]

And this fails only in setup.py test, not in setup.py pytest

Have a look at test_url_encoding_from_basestring: the code (in api_tests and api, as well as in Python 2.7.4) hasn't been changed for 5 years. Probably it was never tested, or not tested for a long time before rPWBCff6c72e. Does it work on Travis? See T121318.

This comment was removed by Dvorapa.
This comment was removed by Dvorapa.

I don't know, it does not seem to be this:

Ran 1710 tests in 1518.034s
FAILED (failures=1, errors=1, skipped=109, expected failures=14, unexpected successes=1)
Ran 1710 tests in 2255.929s
FAILED (failures=1, errors=10, skipped=109, expected failures=14, unexpected successes=1)

There seems to be no difference even in the tests that were run

This comment was removed by Dvorapa.

> Have a look at test_url_encoding_from_basestring: the code (in api_tests and api, as well as in Python 2.7.4) hasn't been changed for 5 years. Probably it was never tested, or not tested for a long time before rPWBCff6c72e. Does it work on Travis? See T121318.

This seems to be a bug in urllib: urllib tries every possible way to make the query bytes (str), and finally calls s.rstrip('/') on it. This works quite well until we use from __future__ import unicode_literals in our code. That turns the call into s.rstrip(u'/'), but s is bytes, which produces the error. Either we should somehow disable unicode_literals for urllib, use something other than urllib, or just ignore this and deprecate Python 2 ASAP
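One way to sidestep the mismatch, regardless of interpreter version, is to encode unicode values to UTF-8 bytes before they reach urllib, so the quoting layer never has to guess an encoding. A minimal sketch of the idea; the real encode_url in pywikibot/data/api.py may differ:

```python
from urllib.parse import urlencode  # Python 3; Python 2 used urllib.urlencode

def encode_url(query):
    """Percent-encode key/value pairs, pre-encoding text values to UTF-8.

    Passing bytes means the quoting layer never mixes a byte string with
    a unicode `safe` argument, which was the root of the Python 2 error.
    """
    return urlencode([
        (key, value.encode('utf-8') if isinstance(value, str) else value)
        for key, value in query
    ])

print(encode_url([('titles', 'Ñandú')]))  # titles=%C3%91and%C3%BA
```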

I don't know why this is failing. Why would urllib pick up the unicode_literals that Pywikibot imports? This is some sort of mystery. It seems there is some issue with the Python 2 cache, but why?

https://github.com/jxtech/wechatpy/issues/375 probably gives a hint

I am fine with skipping these tests and adding a warning that Python 2 is not functional in some parts and may fail; support for it will therefore be dropped.
But I am not interested in spending a lot of time on this outdated Python version, and I also do not want to have Python 2 installed for debugging on my development environment. Looking forward, not back: Python 3/4 is the way.

Maybe we can start a new task to collect all Python 2 related bugs, decrease their priority, and review any patch if someone is able and willing to fix some of them. But let them fail as expected failures when they do fail.
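Skipping the affected tests on Python 2 only could be done with unittest's decorators; a hypothetical sketch, not the actual pywikibot test code:

```python
import sys
import unittest

class TestUrlEncoding(unittest.TestCase):
    # Skip where the interpreter's urllib quoting chokes on non-ASCII bytes.
    @unittest.skipIf(sys.version_info[0] == 2,
                     'urllib mixes bytes and unicode on Python 2')
    def test_encoding_from_unicode(self):
        from urllib.parse import quote_plus
        self.assertEqual(quote_plus('Ñandú'), '%C3%91and%C3%BA')

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestUrlEncoding)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True on Python 3
```

Alternatively, @unittest.expectedFailure would keep the run green while still recording the failure, matching the "let them fail expectedly" suggestion above.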

Change 567457 had a related patch set uploaded (by Xqt; owner: Xqt):
[pywikibot/core@master] [bugfix] preload urllib.quote() with Python 2

https://gerrit.wikimedia.org/r/567457

Change 567457 merged by jenkins-bot:
[pywikibot/core@master] [bugfix] preload urllib.quote() with Python 2

https://gerrit.wikimedia.org/r/567457

Dvorapa renamed this task from A lot of test fails due to UnicodeDecodeError with Python2 at Appveyor to A lot of test fails due to UnicodeDecodeError with Python2 at Appveyor and Travis. Jan 27 2020, 2:57 PM
Dvorapa renamed this task from A lot of test fails due to UnicodeDecodeError with Python2 at Appveyor and Travis to A lot of test fails due to UnicodeDecodeError with Python 2 at Appveyor and Travis. Jan 27 2020, 3:00 PM
Dvorapa assigned this task to Xqt.