Problem:
We need tests that ensure we don't break the Wikimedia setup, including the various extensions, backend services, and cross-wiki interactions we have.
Context:
The suite of API integration tests included with MediaWiki core is designed to run locally in a development environment (docker-compose) and against proposed changes during code review (CI/Jenkins). The tests are written to work against a vanilla configuration (with DevelopmentSettings enabled).
Things that cannot be tested against a vanilla setup include:
- language links (because no language interwiki prefixes are defined)
- remote image repo
- CirrusSearch
- change propagation
- Wikidata includes
- authorization via OAuth
- etc.
Solution:
Create a suite of tests designed to run against the beta cluster on labs. Run them periodically (perhaps every time we pull the latest version of the code, which is every ten minutes or so).
This test suite should live in its own repo, since it is Wikimedia-specific.
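One way the periodic run could be scheduled is a plain cron job on the test runner host. This is only a sketch; the paths, job placement, and log location are made up:

```shell
# Hypothetical crontab entry: re-run the beta-cluster API test suite
# every ten minutes, roughly in step with the beta code update cadence.
*/10 * * * * cd /srv/beta-api-tests && npm test >> /var/log/beta-api-tests.log 2>&1
```

In practice this would more likely live in a CI scheduler (e.g. a timed Jenkins job), but the cadence is the same either way.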
Complication:
- In order to run these tests, we have to share at least one secret ($wgSecret) between the test runner's environment and the wiki.
- While developing tests, it would be useful to be able to run them from the local machine, against the remote testing environment. How would sharing the secret work in this scenario? Something like a temporary bot password might work, but how would the developer acquire one?
- Tests must be written defensively in a way that allows them to reliably function while other users (humans, selenium tests, etc) interact with the same wiki instances. For the most part, this can be achieved by randomizing page titles and user names.
- It's unclear who would receive notification of failures. Ideally, the authors (teams?) of patches merged after the tests were last green would receive them. This may not be trivial to do, though.