Based in Nantes, France CET/CEST (UTC+1, UTC+2)
Main IRC channel is #wikimedia-releng
Any chance to have the deprecation notice to be either removed or fixed? :]
The Docker images for Quibble ship python3, which is a dependency of Quibble itself. There is no python2 package installed that would provide /usr/bin/python.
Well done @matmarex 🏅
As noted in commit b493d4e558368ab6fe9995051c6d9e2992df8bf2:
I have deleted most of them. Some repositories are read-only / archived but still have PHP / content files, so potentially one might still be able to clone them from Gerrit and expect the code to work.
Maybe we can re-enable the test in REL1_31 and apply the fix there? It only affects tests though, so it is probably not needed.
Left to do is to rebase the mediawiki/skins change https://gerrit.wikimedia.org/r/c/mediawiki/skins/+/440106 :]
@Umherirrender seems like using binary as the database character set fixes the test. Thus I am not sure whether the database change is still needed, but then I haven't looked at your change.
Let's do the analysis on our Phabricator task based on the upstream questions. Then we can come back to them with a detailed bug report. At least at first glance there is no duplicate external id, but we have some emails associated with multiple accounts. That was already the case in ReviewDB, as one can see by looking at the accounts_external_ids table. I guess the migration did not take that into account, and since Gerrit 2.15 no longer enforces uniqueness of email addresses, we get screwed up.
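The check for emails associated with multiple accounts is simple enough to sketch. A minimal Python example (Python used purely for illustration; the rows and account ids are made up, and the column pairing is assumed from the accounts_external_ids table):

```python
from collections import defaultdict

# Rows as (account_id, email_address) pairs, as if dumped from the
# accounts_external_ids table. Sample data is invented for illustration.
rows = [
    (5900, "firstname.lastname@example.org"),
    (4500, "email@example.com"),
    (6100, "email@example.com"),  # same email, different account
]

accounts_by_email = defaultdict(set)
for account_id, email in rows:
    accounts_by_email[email].add(account_id)

# Emails associated with more than one account: the entries a
# migration that assumes unique emails would trip on.
duplicates = {e: ids for e, ids in accounts_by_email.items() if len(ids) > 1}
print(duplicates)  # {'email@example.com': {4500, 6100}}
```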
As @Smalyshev noted, the test ends up being logged out at some point. I am pretty sure it is the same issue we encountered on T191537, which is that cookies from a different session were sent by a background job executed when logging in. That invalidates the session and causes the logged-out page issue.
We have good reason to think that preventing jobs from sending cookies/headers is the definitive fix. https://gerrit.wikimedia.org/r/439289
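A toy model of the failure mode, in Python for illustration only (all names are hypothetical, not MediaWiki's actual API): a background job that emits its own session cookie clobbers the cookie the login flow just set, so the next page render sees an unknown session and shows the logged-out view.

```python
# Toy model of the race: a background job spawned during login attaches
# its own cookies/headers to the response, replacing the session cookie
# the login flow set. Names are illustrative, not MediaWiki's API.
sessions = {"s-login": "Alice"}   # valid server-side sessions
browser_cookies = {}

def login_response():
    browser_cookies["session"] = "s-login"     # cookie set by login

def background_job(allowed_to_send_headers):
    if allowed_to_send_headers:
        # the job runs with its own (different) session context
        browser_cookies["session"] = "s-job"   # clobbers the login cookie

def render_page():
    user = sessions.get(browser_cookies.get("session"))
    return f"Logged in as {user}" if user else "Logged out"

login_response()
background_job(allowed_to_send_headers=True)
print(render_page())   # the spurious logged-out page

browser_cookies.clear()
login_response()
background_job(allowed_to_send_headers=False)  # the fix: jobs stay silent
print(render_page())   # login sticks
```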
Yup, I got that Gerrit now relies on NoteDB for the accounts data, but at least the database gives us a snapshot of the pre-migration state. I guess the conversion from ReviewDB to NoteDB has not been straightforward and caused a whole bunch of issues and errors.
$ ssh -p 29418 gerrit.wikimedia.org 'gerrit gsql'
gerrit> select * from account_external_ids where email_address = 'email@example.com';
 account_id |         email_address          | password |   external_id
------------+--------------------------------+----------+-----------------
 59yy       | firstname.lastname@example.org | NULL     | gerrit:jbranaa
 45xx       | email@example.com              | NULL     | gerrit:jrbranaa
Note how the same email is known with two different ids.
Found the passphrase for integration-puppetmaster01 in the labs/private repo.
# keyholder arm
Enter passphrase for /etc/keyholder.d/cumin_openstack_integration_master:
Identity added: /etc/keyholder.d/cumin_openstack_integration_master (/etc/keyholder.d/cumin_openstack_integration_master)
root@integration-cumin:~# keyholder status
keyholder-agent: active - 256 06:36:d8:17:14:ac:73:73:3b:71:ea:bf:1f:59:e1:23 /etc/keyholder.d/cumin_openstack_integration_master (ED25519)
keyholder-proxy: active - 256 06:36:d8:17:14:ac:73:73:3b:71:ea:bf:1f:59:e1:23 /etc/keyholder.d/cumin_openstack_integration_master (ED25519)
$ keyholder status
keyholder-agent: active - The agent has no identities.
keyholder-proxy: active - The agent has no identities.
I have triggered the Quibble jobs against the dummy change https://gerrit.wikimedia.org/r/c/mediawiki/extensions/AntiSpoof/+/54376
--- a/dockerfiles/quibble-stretch/mariadb.cnf
+++ b/dockerfiles/quibble-stretch/mariadb.cnf
 [mysqld]
+character_set_server = binary
+character_set_filesystem = binary
+collation_server = binary
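For context on why the binary character set matters, here is a rough Python analogue (illustration only, not MariaDB's actual comparison code): a typical case-insensitive collation folds case before comparing, while binary compares raw bytes, so values that collide under the former stay distinct.

```python
# Rough analogue of collation behaviour. A case-insensitive collation
# (e.g. utf8_general_ci) folds case before comparing; binary collation
# compares the raw bytes, so 'Foo' and 'foo' are different values.
def ci_equal(a: str, b: str) -> bool:
    return a.casefold() == b.casefold()   # ~ case-insensitive collation

def binary_equal(a: bytes, b: bytes) -> bool:
    return a == b                         # ~ binary collation

print(ci_equal("Foo", "foo"))                        # True
print(binary_equal("Foo".encode(), "foo".encode()))  # False
```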
I triggered builds of the Math change https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Math/+/436237 and it passed all CI jobs. So I am assuming it is definitely fixed.
@hashar Could you please take care of taking down https://github.com/wikimedia/operations-software-tessera when you can? Afterwards this task can be closed. Thanks.
A summary of our chat yesterday:
@Lucas_Werkmeister_WMDE don't worry, that can wait until tomorrow :-] Feel free to poke me on IRC tomorrow morning.
In CI, WikibaseQuality does not depend on Math:
$ colordiff -u Math WikibaseQuality
--- Math	2018-06-12 18:54:57.569521197 +0200
+++ WikibaseQuality	2018-06-12 18:55:13.609284832 +0200
@@ -1,4 +1,3 @@
-mediawiki/extensions/Math
 mediawiki/extensions/BetaFeatures
 mediawiki/extensions/Capiunto
 mediawiki/extensions/CentralAuth
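The colordiff above boils down to a set difference between the two generated dependency lists. A quick Python equivalent (the lists are inlined here instead of being read from the Math and WikibaseQuality files, and are truncated to the entries shown in the diff):

```python
# Dependency lists as sets; in practice these would be read from the
# generated Math / WikibaseQuality files, one repository per line.
math_deps = {
    "mediawiki/extensions/Math",
    "mediawiki/extensions/BetaFeatures",
    "mediawiki/extensions/Capiunto",
    "mediawiki/extensions/CentralAuth",
}
wbq_deps = {
    "mediawiki/extensions/BetaFeatures",
    "mediawiki/extensions/Capiunto",
    "mediawiki/extensions/CentralAuth",
}

# Dependencies pulled in for Math but not for WikibaseQuality
print(sorted(math_deps - wbq_deps))  # ['mediawiki/extensions/Math']
```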
Or delay it until after the SRE offsite? It is not that urgent anyway :] That also leaves time for people to react to the announcement, especially volunteers who are usually active solely during the weekend.
Wikibase was failing the Selenium tests which are run by the 'quibble' job, at least for PHP 7. It might be faster and thus ended up hitting the race condition.
I have crafted a dummy change for Wikibase which depends on the mediawiki/core patch: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Wikibase/+/439847 One build passed and a second is being run. On another Wikibase change the selenium test was failing constantly.
So I think it is a proper fix :]
We definitely should use PolyGerrit as the default. It is still beta in 2.15, but that will prepare people to switch to it later on when there is no other choice.
Yesterday I tried to use @runInSeparateProcess to write a test for headers/cookies; there are a few things that need to be fixed first, but eventually it leads to an infinite loop. I have filed T193957.