Before attempting the schema conversion on a live system, we want to run the populateContentTables.php script against the test databases on db1111 and db1112.
The tricky bit is figuring out how best to run a maintenance script against a "non-standard" db. Also, before running the schema migration (stage 2), we'll have to create the new tables (stage 0) and clean up the archive table (stage 1).
When running the script, the first things we are interested in are (see the sketch after this list):
- does it run smoothly,
- how long does it take, and
- how long does it take to re-run it on an already converted db?
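A minimal sketch of such a test run, assuming the usual mwscript wrapper and a placeholder wiki name (neither is confirmed for this setup):

```
# Initial run against the test wiki ("testwiki" is a placeholder);
# time it to answer the second question above.
time mwscript populateContentTables.php --wiki=testwiki

# Re-run on the already converted db to measure the no-op case.
time mwscript populateContentTables.php --wiki=testwiki
```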
If needed, we can reset the test by simply truncating the relevant tables (slots, content, content_models, and slot_roles).
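For example (host and database names are placeholders):

```
# Reset the test by emptying the MCR tables on the test host.
mysql -h db1111 testwiki -e "
  TRUNCATE TABLE slots;
  TRUNCATE TABLE content;
  TRUNCATE TABLE content_models;
  TRUNCATE TABLE slot_roles;"
```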
Of course, we'll want to know whether the conversion was actually successful. That's not easy to tell, but as a first sanity check, I'd assert that (a rough SQL sketch follows this list):
- the slots table and the content table have the same number of rows (this should be the case right after conversion).
- that number equals the number of rows in the revision table plus the number of rows in the archive table.
- slot_roles contains exactly one row (for "main").
- content_models contains entries not just for wikitext, but also for JS and CSS pages.
- the content rows for *.js and *.css pages in NS_MEDIAWIKI and NS_USER have the correct content model.
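A rough sketch of these checks, assuming direct SQL access to the test db; the host and database names are placeholders, and the MCR column names used below should be verified against the current tables.sql:

```
# Post-conversion sanity checks; host/db names are placeholders.
mysql -h db1111 testwiki -e "
  -- (1) slots and content should have the same number of rows,
  -- (2) matching revision + archive
  SELECT (SELECT COUNT(*) FROM slots)    AS slots_rows,
         (SELECT COUNT(*) FROM content)  AS content_rows,
         (SELECT COUNT(*) FROM revision)
       + (SELECT COUNT(*) FROM archive)  AS expected_rows;

  -- (3) slot_roles should contain exactly one row, for 'main'
  SELECT * FROM slot_roles;

  -- (4) content_models should list wikitext as well as javascript and css
  SELECT * FROM content_models;

  -- (5) spot-check the model of *.js / *.css pages in NS_MEDIAWIKI (8)
  --     and NS_USER (2), via the latest revision of each page
  SELECT p.page_namespace, p.page_title, cm.cm_model
  FROM page p
  JOIN slots s            ON s.slot_revision_id = p.page_latest
  JOIN content c          ON c.content_id = s.slot_content_id
  JOIN content_models cm  ON cm.cm_id = c.content_model
  WHERE p.page_namespace IN (8, 2)
    AND (p.page_title LIKE '%.js' OR p.page_title LIKE '%.css');"
```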