There should be an easy way to set up a basic, SQLite-backed MediaWiki instance, import a few representative articles from Wikipedia, and measure the backend response time to a set of representative user requests.
Details
- Reference: bz68821
Event Timeline
The idea here would be to have a benchmark script, like the ones we already have in MediaWiki's maintenance/benchmarks directory.
Beyond the few scripts already there, it might be useful to adopt the practice of running these before each major release, re-running them on the previous release on the same hardware for comparison, and documenting the results.
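The release-to-release comparison could be as simple as computing the median of each run and the relative change. A minimal sketch (function and field names are illustrative, not an existing tool):

```python
import statistics

def compare_runs(baseline_ms, current_ms):
    """Compare two benchmark runs (lists of per-request timings in ms).

    Returns the median of each run and the relative change, so a run on
    a new release can be checked against a re-run on the previous
    release performed on the same hardware.
    """
    base = statistics.median(baseline_ms)
    curr = statistics.median(current_ms)
    change_pct = (curr - base) / base * 100.0
    return {"baseline_ms": base, "current_ms": curr, "change_pct": change_pct}

# Example with timings from two hypothetical runs of the same scenario.
result = compare_runs([120.0, 118.0, 125.0], [131.0, 128.0, 135.0])
print(f"median {result['baseline_ms']:.1f} ms -> {result['current_ms']:.1f} ms "
      f"({result['change_pct']:+.1f}%)")
```

Documenting the medians alongside the percentage change makes regressions between releases easy to spot at a glance.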
@aaron also mentioned during our offsite that it might be interesting to go beyond a simple maintenance script (where installation, configuration, and extension setup have to be done manually and may not be consistent between runs), and instead have a shell script that performs these steps as well. For example, it could install MediaWiki into a temporary directory using SQLite, install a number of extensions, import a number of templates and articles, and then run the various benchmarks.
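A sketch of that orchestration, assuming MediaWiki's install.php and importDump.php maintenance scripts with their usual flags (check each script's --help; the wiki name, admin user, password, and dump filename below are arbitrary placeholders):

```python
import subprocess
import tempfile
from pathlib import Path

def install_cmd(core_dir: Path, data_dir: Path) -> list[str]:
    """Command line for a throwaway SQLite-backed install."""
    return [
        "php", str(core_dir / "maintenance" / "install.php"),
        "--dbtype=sqlite",
        f"--dbpath={data_dir}",
        "--pass=benchmark-only",   # placeholder admin password
        "BenchWiki", "Admin",      # placeholder wiki name and admin user
    ]

def import_cmd(core_dir: Path, dump_file: Path) -> list[str]:
    """Command line for importing an XML dump of representative pages."""
    return [
        "php", str(core_dir / "maintenance" / "importDump.php"),
        str(dump_file),
    ]

def run_benchmarks(core_dir: Path, dump_file: Path) -> None:
    """Install into a temp dir, import a dump, then run the benchmarks."""
    with tempfile.TemporaryDirectory() as tmp:
        data_dir = Path(tmp)
        subprocess.run(install_cmd(core_dir, data_dir), check=True)
        subprocess.run(import_cmd(core_dir, dump_file), check=True)
        # ... invoke maintenance/benchmarks/* scripts and collect timings ...

# Show the install command this would run (without executing anything).
print(" ".join(install_cmd(Path("./mediawiki"), Path("/tmp/bench-db"))))
```

Because everything lands in a temporary directory with a fresh SQLite database, each run starts from the same state, which addresses the consistency concern above.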
For example, it could involve a macro benchmark for viewing a given article, with the object caches and parser cache cleared in between runs. That might be enough as a starting point; additional scenarios could be added later.
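The measurement loop for such a macro benchmark could look like the sketch below. The `view_article` and `clear_caches` callables are placeholders a real harness would supply (e.g. an HTTP request to index.php, and truncating the cache tables):

```python
import statistics
import time

def macro_benchmark(view_article, clear_caches, iterations=5):
    """Time repeated article views with caches cleared in between.

    `view_article` performs one request against the wiki; `clear_caches`
    resets the object and parser caches so every iteration measures an
    uncached render. Returns the per-iteration timings in milliseconds.
    """
    timings_ms = []
    for _ in range(iterations):
        clear_caches()
        start = time.perf_counter()
        view_article()
        timings_ms.append((time.perf_counter() - start) * 1000.0)
    return timings_ms

# Usage with stand-in callables in place of a real wiki request.
times = macro_benchmark(lambda: sum(range(100_000)), lambda: None, iterations=3)
print(f"median: {statistics.median(times):.2f} ms over {len(times)} runs")
```

Reporting the full list of timings rather than a single number also makes it possible to spot warm-up effects or outliers across iterations.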