When changing the structure of ParserOutput objects, or changing the serialization method, large parts of the ParserCache may be invalidated at once, triggering a cache stampede on a busy site. On Wikimedia sites, we can probably avoid this situation by transitioning to a new serialization format while retaining compatibility with the old one, but this cannot be done for arbitrarily old cached versions of ParserOutput. Third parties updating from an older revision of MediaWiki to the current one therefore need a way to warm the cache during (or ideally before) the update process.
The basic idea for this script is to iterate over a set of pages and write (canonical) parser cache entries for them. We probably cannot re-use existing cache entries, since the whole reason for running the script is that they have become unusable.
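As a rough sketch of that loop, something like the following could work. This is untested and purely illustrative: it assumes MediaWiki's Maintenance framework, and the exact ParserCache/ParserOptions API (e.g. ParserOptions::newCanonical, the getParserOutput $forceParse flag) has shifted between releases, so the method names here should be treated as approximate. The script name and class name are made up for this sketch.

```php
<?php
// warmParserCache.php -- hypothetical maintenance script (sketch only).

require_once __DIR__ . '/Maintenance.php';

use MediaWiki\MediaWikiServices;

class WarmParserCache extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->addDescription(
			'Write canonical parser cache entries for a set of pages' );
		$this->addOption( 'namespace', 'Only operate on this namespace',
			false, true );
		$this->setBatchSize( 100 );
	}

	public function execute() {
		$services = MediaWikiServices::getInstance();
		$parserCache = $services->getParserCache();

		foreach ( $this->getPageIds() as $pageId ) {
			$page = WikiPage::newFromID( $pageId );
			if ( !$page ) {
				continue;
			}
			// Render with canonical options. $forceParse = true bypasses
			// the (possibly unusable) existing cache entry.
			$popts = ParserOptions::newCanonical( 'canonical' );
			$output = $page->getParserOutput( $popts, null, true );
			if ( $output ) {
				$parserCache->save( $output, $page, $popts );
			}
			$this->waitForReplication();
		}
	}

	private function getPageIds() {
		// Page selection deliberately left out of this sketch.
		return [];
	}
}

$maintClass = WarmParserCache::class;
require_once RUN_MAINTENANCE_IF_MAIN;
```

Depending on the MediaWiki version, rendering through PoolWorkArticleView or the RevisionRenderer service may be more appropriate than calling getParserOutput() directly, since those code paths handle cache writes and stampede protection themselves.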
The set of pages to operate on could be given explicitly, or selected by namespace. Operation could also be restricted to pages that have been recently edited, or recently re-rendered; this would provide a heuristic for finding "busy" pages. We could also detect which pages already have an entry in the parser cache, without actually loading those entries.
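The selection criteria above could be expressed as database queries inside the script. The table and field names below (page, recentchanges, rc_cur_id) are from the standard MediaWiki schema, but the fragment is a sketch, not tested code; $namespace and $cutoff are assumed to come from script options.

```php
<?php
// Sketch: candidate-page selection inside a Maintenance subclass.
$dbr = $this->getDB( DB_REPLICA );

// All pages in a given namespace:
$pageIds = $dbr->selectFieldValues(
	'page',
	'page_id',
	[ 'page_namespace' => $namespace ],
	__METHOD__
);

// Only recently edited pages, using recentchanges as the
// heuristic for "busy" pages:
$pageIds = $dbr->selectFieldValues(
	[ 'page', 'recentchanges' ],
	'page_id',
	[ 'rc_timestamp > ' . $dbr->addQuotes( $dbr->timestamp( $cutoff ) ) ],
	__METHOD__,
	[ 'DISTINCT' ],
	[ 'recentchanges' => [ 'JOIN', 'rc_cur_id = page_id' ] ]
);
```

Restricting to recently re-rendered pages is less straightforward, since render timestamps live in the cache entries themselves rather than in a queryable table; that part likely needs support from the ParserCache backend.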
Note that what we need may turn out to be pretty close to refreshLinks.php.