
Jenkins: Set up PHPUnit testing on PostgreSQL backend
Closed, ResolvedPublic

Assigned To
Authored By
saper
Jun 14 2012, 6:00 PM
Referenced Files
F9371: mwtest
Nov 22 2014, 12:24 AM

Description

As of c15d0a7521231c2cb71e664265e08d0ae514fc73 we now have 3 unit tests failing and 1 error because PostgreSQL support has not been kept up to date. This could be avoided if we had PostgreSQL unit tests running by default, as we did for SVN.

I have filed the following so far (should there be a dependency? I don't think so):

https://bugzilla.wikimedia.org/show_bug.cgi?id=36759 WikiPageTest::testDoDeleteArticle fails on PostgreSQL

https://bugzilla.wikimedia.org/show_bug.cgi?id=37600
WikiPageTest::testDoDeleteUpdates fails on PostgreSQL

https://bugzilla.wikimedia.org/show_bug.cgi?id=37601
Tests introduced in tests/phpunit/includes/db/TestORMRowTest.php fail on PostgreSQL


Version: unspecified
Severity: enhancement

Related Objects

Event Timeline

bzimport raised the priority of this task from to Low.Nov 22 2014, 12:24 AM
bzimport set Reference to bz37602.
bzimport added a subscriber: Unknown Object (MLST).

What is needed, what is missing to solve this? I see a disabled project MediaWiki-postgres-phpunit on Jenkins. Can it be enabled, but not connected to Gerrit until this is fixed?

(In reply to comment #1)

What is needed, what is missing to solve this? I see a disabled project
MediaWiki-postgres-phpunit on Jenkins. Can it be enabled, but not connected to
Gerrit until this is fixed?

That test was connected to SVN. So there's no way to really enable it without connecting it to gerrit to begin with.

(In reply to comment #2)

What is needed, what is missing to solve this? I see a disabled project
MediaWiki-postgres-phpunit on Jenkins. Can it be enabled, but not connected to
Gerrit until this is fixed?

That test was connected to SVN. So there's no way to really enable it without
connecting it to gerrit to begin with.

Sure; I meant the feedback to code review. The shared build.xml looks like it can build a PostgreSQL database, so what is needed to switch off the "-2" votes to Gerrit?

How is the interchange between Gerrit and Jenkins handled anyway? There seem to be two plugins: Gerrit and Gerrit Trigger. http://www.mediawiki.org/wiki/Continuous_integration/Workflow suggests that we use the former, while various config.xml files suggest the latter. Neither integration/jenkins.git nor operations/puppet.git seems to have code to actually install or configure a plugin. Is this done manually? The latter repository has files/gerrit/hooks/patchset-created, but it does not seem to trigger Jenkins.

After further reading: If Gerrit Trigger is used, "<silentMode>true</silentMode>" seems to be what is needed:

Sets silent mode to on or off.
When silent mode is on there will be no communication back to Gerrit,
i.e. no build started/failed/successful approve messages etc.
If other non-silent jobs are triggered by the same Gerrit event as this job,
the result of this job's build will not be counted in the end result of the other jobs.

Lowering priority; this is definitely less important than having the integration tests run against MySQL :-]

Setting to "normal", since the last we heard, the tests were running only from SVN, not from git. Why can't we have them back?

I'd rather have a broken feature restored than some new stuff implemented.

Sorry to hear that. I have some trouble catching up with people merging things like LOCK IN SHARE MODE (e.g. bug 46594) and breaking other databases.

We have a PostgreSQL database installed on the Jenkins machine, but I haven't found the time to write all the glue needed to make this possible. Namely:

  • write a script to create a user/database in PostgreSQL
  • write another script to clean up PostgreSQL afterwards
  • pass credentials to the MediaWiki CLI installer; an example for SQLite is in integration/jenkins.git as bin/mw-install-sqlite.sh
  • write the Jenkins job templates that would invoke the above scripts
  • trigger said jobs in Jenkins using Zuul (easy)

The nasty thing is making sure the database/user is cleaned out at the end of the job.
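
A minimal sketch of what such an install helper could look like (the variable names, the password and the wiki name are illustrative assumptions, not the actual CI setup; the install.php options are the ones discussed later in this task):

# unique per build; BUILD_NUMBER is set by Jenkins for every job run
DBNAME="jenkins_mw_${BUILD_NUMBER:-test}"
DBUSER="$DBNAME"
DBPASS="testpass"

# create an isolated role and a database owned by it
psql -c "CREATE USER \"$DBUSER\" WITH PASSWORD '$DBPASS';" template1
psql -c "CREATE DATABASE \"$DBNAME\" WITH OWNER \"$DBUSER\";" template1

# run the MediaWiki CLI installer against that database
php maintenance/install.php --dbtype=postgres \
    --dbuser "$DBUSER" --dbpass "$DBPASS" --dbname "$DBNAME" \
    --pass adminpass --server http://localhost --scriptpath /mw \
    TestWiki WikiAdmin

The teardown script would then be the matching DROP DATABASE / DROP USER pair.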

hashar, can you point me to the relevant portions of the git tree where those scripts should be located (like the ones you have for MySQL)?

Is there any environment in labs I could test it with?

I hope to get the PostgreSQL tests back in shape again soon, so it would be super cool to have those things run by Jenkins to surprise unsuspecting developers.

Marcin, sorry for the delay.

The shell scripts run by Jenkins are hosted in integration/jenkins.git under the /bin directory.

A while back I created basic placeholders:

mw-install-mysql.sh@ -> not-implemented.sh
mw-install-postgre.sh@ -> not-implemented.sh
mw-install-postgresql.sh@ -> not-implemented.sh

The one used by SQLite is mw-install-sqlite.sh. Maybe it can be made a generic shell script that takes the type of database to set up as an argument (or looks up the database name from the $0 variable).
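
A minimal sketch of dispatching on the script name (the naming convention and the set of supported backends here are assumptions):

# mw-install-postgres.sh, mw-install-mysql.sh etc. could all be symlinks
# to one generic script that derives the database type from its own name
DBTYPE="$(basename "$0")"        # e.g. "mw-install-postgres.sh"
DBTYPE="${DBTYPE#mw-install-}"   # -> "postgres.sh"
DBTYPE="${DBTYPE%.sh}"           # -> "postgres"

case "$DBTYPE" in
    sqlite|mysql|postgres)
        php maintenance/install.php --dbtype="$DBTYPE" "$@"
        ;;
    *)
        echo "Unsupported database type: $DBTYPE" >&2
        exit 1
        ;;
esac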

How can the mw-install-mysql.sh be unimplemented?

Is there any point in working on this now, or will the Phabricator migration invalidate any work done?

From which repository is PostgreSQL installed on this server?

thanks,

Jeff

(In reply to Jeff Janes from comment #13)

How can the mw-install-mysql.sh be unimplemented?

[...]

There are no tests run for MySQL at the moment (cf. bug #35912).

(In reply to Marcin Cieślak from comment #11)

[...]

Is there any environment in labs I could test it with?

[...]

hashar, that would be very interesting to know indeed. I have a test script on my local machine that sets up and tears down a PostgreSQL database (and a MySQL database, in fact, depending on an option), but "guessing" the environment it is run in doesn't seem like a sensible approach :-).

Someone needs to figure out a way to:

  • create a unique username / database when a job starts
  • inject that in LocalSettings.php, or have the dbname / credentials generated in a predictable way
  • delete the database / remove the credentials

Then, add a whole lot of jobs that trigger against MySQL and PostgreSQL. That has never been a priority, though, and I am definitely not working on it.
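
A sketch of generating the name and credentials in a predictable way from standard Jenkins environment variables (the exact naming scheme is an assumption):

# JOB_NAME and BUILD_NUMBER are set by Jenkins for every build, so the
# same values are available to the install, test and teardown steps
DBNAME="ci_$(echo "$JOB_NAME" | tr -cd 'A-Za-z0-9')_${BUILD_NUMBER}"
DBUSER="$DBNAME"
DBPASS="$(head -c 16 /dev/urandom | base64)"

Passing these straight to maintenance/install.php, as in the example in the following comment, avoids having to patch LocalSettings.php afterwards.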

Created attachment 16741
My test script for reference.

I use the attached script.

The user that Jenkins runs the tests as has to be a PostgreSQL superuser that can create other users and databases, authenticated for example by password or ident.

You would probably set TESTID to the Jenkins job number if that is unique. Then:

psql -c "CREATE USER \"$TESTID\" WITH PASSWORD 'abc';" template1
psql -c "CREATE DATABASE \"$TESTID\" WITH OWNER \"$TESTID\";" template1
php ./maintenance/install.php --dbtype=postgres \
    --dbport 5432 \
    --dbuser "$TESTID" \
    --dbpass abc \
    --dbname "$TESTID" \
    --pass testpass \
    --server http://localhost \
    --scriptpath /$SOMETHING_MEANINGFUL \
    postgresqltest WikiAdmin

should create the user and the database and install MediaWiki there (assuming the standard port; my development server runs on port 5433 and thus my script uses that). I haven't looked at how MediaWiki is installed in the SQLite tests, but keeping them in sync makes sense.

If passwords need to be random, you can probably use something like:

TESTPW="$(base64 /dev/urandom | head -c8)"

and then replace 'abc' with '$TESTPW' (the variable still expands inside the double-quoted psql command).

To remove the database and user:

psql -c "DROP DATABASE \"$TESTID\";" template1;
psql -c "DROP USER \"$TESTID\";" template1;

(If TESTID doesn't contain dashes ("-") or other SQL syntax, you could probably get away without the quotes, but they won't hurt.)


Is there a need to create a new user for each test? I would think that it is sufficient to create a new database and use the same user for each one. And the CREATE DATABASE would then probably not be necessary, as install.php does it for you.

I think the main concern would be what happens if the test fails in such a way that it doesn't get to the clean-up step.

If $TESTID values are aggressively recycled, then it should be sufficient just to do a preemptive drop of the database at the beginning of each test run, so that any database left over from a previously uncleanly terminated test would get dropped. But if the $TESTID values are not aggressively recycled (say, they just increment up to 2^32-1 and then restart), that could leak an enormous amount of disk space.
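
For what it's worth, PostgreSQL's IF EXISTS clause makes such a preemptive drop harmless when nothing is left over, e.g.:

psql -c "DROP DATABASE IF EXISTS \"$TESTID\";" template1
psql -c "DROP USER IF EXISTS \"$TESTID\";" template1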

I haven't been able to figure out how $TESTID is chosen.

If it is necessary to have a clean up similar to the sqlite one, where it just drops any databases that haven't been used in a certain amount of time, that would be much harder.

I create a separate user and database for each test in my setup because I have some vague recollections of not every DB object identifier being schema-qualified in some places in the past; having isolated sandboxes minimizes the risk of intermingling. As users and databases are cheap, I use them very generously.

In my setup, I clean up test databases manually, and $TESTID contains the SHA1 of the commit being tested and, as a suffix, the commit that an update is tested from (i.e. $SHA1...-HEAD = a fresh installation of $SHA1 is tested; $SHA1-1.19.21 = 1.19.21 is freshly installed, then $SHA1 is checked out, maintenance/update.php is called and then the tests are run).

With Jenkins, you would probably use the Jenkins job number, which is strictly increasing, to name the user/database; for clean-up you would then just look up a job that is x days old and drop all users/databases with a lower job number. Or you could iterate over all databases, look up when the corresponding Jenkins job finished, and drop the database if that is more than x days ago.

(I think the SQLite way is just to tar up the build directory; the equivalent would be to pg_dump the database to the build directory, drop the database and then tar it up. So no leakage while preserving all data. A bzip2-compressed pg_dump of the database after the tests have run is about 100 kByte on my machine.)
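
A sketch of that pg_dump-then-drop step (the $WORKSPACE variable and the target path are assumptions about the Jenkins job layout):

# preserve the test database with the build artifacts, then reclaim the space
pg_dump "$TESTID" | bzip2 > "$WORKSPACE/log/$TESTID.sql.bz2"
psql -c "DROP DATABASE \"$TESTID\";" template1
psql -c "DROP USER \"$TESTID\";" template1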

Change 171879 had a related patch set uploaded by Tim Landscheidt:
Enable Travis CI for PostgreSQL

https://gerrit.wikimedia.org/r/171879

Change 171879 merged by jenkins-bot:
Enable Travis CI for PostgreSQL

https://gerrit.wikimedia.org/r/171879

Tim, thanks for this! This is very useful.

I hope we can start to test the Jenkins setup someday.

hashar lowered the priority of this task from Low to Lowest.Nov 24 2014, 11:52 AM
hashar set Security to None.

@hashar, is there any way I can set up Jenkins for myself in labs and attach it to gerrit?

We should set this up, to prevent regressions such as T147599.

Travis CI is unreliable for this purpose, since all we really want is for the tests to pass on Wikimedia CI.

Change 316228 had a related patch set uploaded (by Paladox):
Install postgresql on ci in php.pp

https://gerrit.wikimedia.org/r/316228

Change 316230 had a related patch set uploaded (by Paladox):
Add mw-teardown-postgresql for postgres

https://gerrit.wikimedia.org/r/316230

Change 316232 had a related patch set uploaded (by Paladox):
Update mw-install-postgresql to include the install script

https://gerrit.wikimedia.org/r/316232

Change 316475 had a related patch set uploaded (by Paladox):
Install postgresql on ci-image-jessie

https://gerrit.wikimedia.org/r/316475

Jdforrester-WMF added a subscriber: Jdforrester-WMF.

Migrating from the old tracking task to a tag for PostgreSQL-related tasks.

Change 316228 merged by Dzahn:
contint: add postgresql to contint::packages::postgresql

https://gerrit.wikimedia.org/r/316228

Does anyone know what should be done with this open Gerrit change? https://gerrit.wikimedia.org/r/#/c/316230/ It has been open for almost a year now...

Change 316230 abandoned by Dzahn:
Add mw-teardown-postgresql for postgres

https://gerrit.wikimedia.org/r/316230

Change 316475 abandoned by Hashar:
Install postgresql on ci-image-jessie

Reason:
Nodepool is legacy. Maybe Postgre support will be added one day in the Docker container we are going to build to run MediaWiki tests.

https://gerrit.wikimedia.org/r/316475

Legoktm claimed this task.
Legoktm added a subscriber: Legoktm.

CI now supports running MediaWiki tests on the postgres backend. https://integration.wikimedia.org/ci/job/quibble-vendor-postgres-php70-docker/

The main commit implementing support was https://gerrit.wikimedia.org/r/430035

I filed T195807: Fix failing MediaWiki core tests on Postgres database backend to address the current test failures, which need to be fixed before we can make the job voting.

Change 316232 abandoned by Paladox:
Update mw-install-postgresql to include the install script

https://gerrit.wikimedia.org/r/316232