Thanks to work done in T50217: Speed up MediaWiki PHPUnit build by running integration tests in parallel, T361118: [EPIC][Infra] Reduce CI test runtime for Wikibase and related extensions, and other tasks, we are now able to run PHPUnit tests in parallel via Quibble in CI.
This has led to a significant improvement in build times, going from ~30 to ~10 minutes for the wmf-quibble-vendor-mysql-php74 job.
We should now look at enabling parallel PHPUnit execution for the mediawiki/core repo itself, whose builds still regularly clock in around the ~30-minute mark.
Enabling is straightforward:
```python
if (
    # ... temporarily exclude MediaWiki core and extensions
    # that have issues with parallel tests
    params["ZUUL_PROJECT"] not in [
        "mediawiki/core",
        "mediawiki/extensions/WikiLambda",
        # DonationInterface uses a different branching model. Its master
        # branch is tested with mediawiki/core fundraising/REL1_39 branch
        # which does not have the parallel work.
        "mediawiki/extensions/DonationInterface",
    ]
    # ... exclude on REL_ branches (not yet tested/patched), and
    "ZUUL_BRANCH" in params
    and not params["ZUUL_BRANCH"].startswith("REL1")
    # Exclude fundraising branches and specific jobs
    and not params["ZUUL_BRANCH"].startswith("fundraising")
    and not job.name.startswith("quibble-fundraising")
):
    params['QUIBBLE_PHPUNIT_PARALLEL'] = '1'
```
It involves removing mediawiki/core from the exclusion list above in the integration/config repo. There will likely be some breakages when this is done, so we need an approach that balances the disruption caused against the effort required to avoid it. For example, one approach might be to pick a Friday morning to remove mediawiki/core from the list, run some changes through CI, gather the list of tests that break, and then add mediawiki/core back to the config until we have time to update those tests.
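As a sketch, the change amounts to dropping `"mediawiki/core"` from the exclusion list while leaving the other guards intact. The standalone function below is a hypothetical rewrite of the gating condition for illustration only (it is not the actual integration/config code; `job_name` stands in for `job.name`):

```python
def phpunit_parallel_enabled(params, job_name):
    """Sketch of the gating logic with mediawiki/core no longer excluded."""
    excluded_projects = [
        # "mediawiki/core" removed: parallel PHPUnit now enabled for core
        "mediawiki/extensions/WikiLambda",
        "mediawiki/extensions/DonationInterface",
    ]
    return (
        params.get("ZUUL_PROJECT") not in excluded_projects
        # Skip REL1_* branches (not yet tested/patched for parallel runs)
        and "ZUUL_BRANCH" in params
        and not params["ZUUL_BRANCH"].startswith("REL1")
        # Skip fundraising branches and fundraising-specific jobs
        and not params["ZUUL_BRANCH"].startswith("fundraising")
        and not job_name.startswith("quibble-fundraising")
    )
```

With this version, a master-branch mediawiki/core build would set `QUIBBLE_PHPUNIT_PARALLEL`, while WikiLambda, DonationInterface, REL1_* branches, and fundraising jobs remain on serial PHPUnit.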