- Configure MediaWiki to write content meta-data to both the old AND the new columns (via config[**]). Don't forget to also do this for new entries in the archive table.
- Wait a bit and watch for performance issues caused by writing to the new table.
- Run maintenance/populateContentTable.php to populate the content table. The script needs to support chunking (and maybe also sharding, for parallel operation).
- Keep watching for performance issues while the new table grows.
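To illustrate the dual-write phase described above, here is a minimal sketch using an in-memory SQLite database as a stand-in for the real MediaWiki schema. The table and column names (`revision`, `content`, `slots`, `content_address`) are simplified assumptions, not the actual DDL, and `save_revision()` is a hypothetical helper, not a MediaWiki function:

```python
import sqlite3

# Simplified stand-in schema; the real MediaWiki tables have many more columns.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_text_id INTEGER);
    CREATE TABLE content (content_id INTEGER PRIMARY KEY, content_address TEXT);
    CREATE TABLE slots (slot_revision_id INTEGER, slot_content_id INTEGER);
""")

def save_revision(db, text_id):
    # Dual-write: record the revision in the old schema (rev_text_id)
    # AND in the new content/slots tables, in a single transaction,
    # so either read path stays consistent while the backfill runs.
    with db:
        cur = db.execute("INSERT INTO revision (rev_text_id) VALUES (?)", (text_id,))
        rev_id = cur.lastrowid
        cur = db.execute("INSERT INTO content (content_address) VALUES (?)",
                         (f"tt:{text_id}",))
        db.execute("INSERT INTO slots VALUES (?, ?)", (rev_id, cur.lastrowid))
        return rev_id

save_revision(db, 123)
```

Wrapping both writes in one transaction means a failure leaves neither schema half-updated.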
Operation of populateContentTable.php:
- Select n rows from the revision table that do not have a corresponding entry in the slots table (a WHERE NOT EXISTS subquery is probably better than a LEFT JOIN here, because it combines well with LIMIT).
- For each such row, construct corresponding rows for the content and slots tables[*][**]. The rows can either be collected in an array for a later mass insert, or inserted individually, possibly buffered in a transaction.
- The content_models, content_formats, and content_roles tables will be populated as a side effect, by virtue of calling the assignId() function to obtain a numeric ID for each content model, format, and role.
- When all rows in one chunk have been processed, insert/commit the new rows into the content and slots tables and wait for the slaves to catch up.
- Repeat until there are no more rows in revision that have no corresponding row in content. This will eventually be the case, since web requests are already populating the content table when creating new rows in revision.
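The chunked backfill loop above can be sketched as follows, again using an in-memory SQLite database as a simplified stand-in for the real schema. The table layout, the `tt:`-style content addresses, and the `wait_for_slaves()` placeholder are illustrative assumptions:

```python
import sqlite3

CHUNK_SIZE = 2  # tiny value for illustration; a real run would use e.g. 1000

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_text_id INTEGER);
    CREATE TABLE content (content_id INTEGER PRIMARY KEY, content_address TEXT);
    CREATE TABLE slots (slot_revision_id INTEGER, slot_content_id INTEGER);
""")
# Five pre-existing, unmigrated revisions.
db.executemany("INSERT INTO revision VALUES (?, ?)",
               [(i, 100 + i) for i in range(1, 6)])

def wait_for_slaves():
    # Placeholder for replication-lag throttling between chunks.
    pass

while True:
    # Select one chunk of revisions with no corresponding slots row.
    # WHERE NOT EXISTS combines well with LIMIT, unlike a LEFT JOIN.
    rows = db.execute("""
        SELECT rev_id, rev_text_id FROM revision r
        WHERE NOT EXISTS
            (SELECT 1 FROM slots s WHERE s.slot_revision_id = r.rev_id)
        ORDER BY rev_id LIMIT ?
    """, (CHUNK_SIZE,)).fetchall()
    if not rows:
        break  # no unmigrated revisions left; done
    # Build and commit the content + slots rows for this chunk in one transaction.
    with db:
        for rev_id, text_id in rows:
            cur = db.execute("INSERT INTO content (content_address) VALUES (?)",
                             (f"tt:{text_id}",))
            db.execute("INSERT INTO slots VALUES (?, ?)", (rev_id, cur.lastrowid))
    wait_for_slaves()
```

Because new revisions written during the run already get content/slots rows (via the dual-write config), the NOT EXISTS predicate shrinks monotonically and the loop terminates.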
The same procedure applies analogously to the archive table.
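The assignId() mechanism mentioned in the script description is essentially a get-or-insert lookup on a small name/ID table. A minimal sketch, with assumed column names (`model_id`, `model_name`) and a hypothetical `assign_id()` helper:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One of the small lookup tables (content_models, content_formats, content_roles).
db.execute("""CREATE TABLE content_models
              (model_id INTEGER PRIMARY KEY, model_name TEXT UNIQUE)""")

def assign_id(db, table, id_col, name_col, name):
    # Return the numeric ID for `name`, inserting a new row on first use.
    # This is how the lookup tables get populated as a side effect.
    row = db.execute(f"SELECT {id_col} FROM {table} WHERE {name_col} = ?",
                     (name,)).fetchone()
    if row:
        return row[0]
    cur = db.execute(f"INSERT INTO {table} ({name_col}) VALUES (?)", (name,))
    return cur.lastrowid

a = assign_id(db, "content_models", "model_id", "model_name", "wikitext")
b = assign_id(db, "content_models", "model_id", "model_name", "wikitext")
assert a == b  # repeated calls for the same name yield the same ID
```

In a real, concurrent setting the insert would need to tolerate races (e.g. insert-ignore plus re-select), which this sketch omits.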