We already have this functionality for WMF wikis, not as a maintenance script but as a job[1]. Third parties will need a mechanism to warm their caches with Parsoid output because several extensions and core are now beginning to use Parsoid output for views, edits, etc.
So that third-party wikis don't see a performance degradation when they begin using the new backend, we should provide a maintenance script they can run as a first step to prepare their caches with the appropriate Parsoid parser output. That way, performance after the switch stays the same as with the legacy output.
Ideally, the script should progressively go through all pages on the target wiki, parse them with Parsoid, and save the output in the ParserCache (the backend for the ParserCache can be configured via https://gerrit.wikimedia.org/g/mediawiki/core/+/ab1a809acc6633fd7ebd2027688d51c4813754d1/docs/config-schema.yaml#2465). The script should also be stateful: if it is aborted or fails at a specific point, it should be able to resume from there.
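The resume behaviour could be sketched roughly as follows. This is an illustrative Python sketch only (the real script would be a PHP maintenance script); the checkpoint file name and the `parse_and_cache` callback are hypothetical:

```python
import json
import os

CHECKPOINT_FILE = "prewarm.checkpoint.json"  # illustrative name

def load_checkpoint():
    """Return the last successfully processed page ID, or 0 on a fresh run."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["last_page_id"]
    return 0

def save_checkpoint(page_id):
    """Record progress so an aborted or failed run can resume here."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"last_page_id": page_id}, f)

def warm_cache(page_ids, parse_and_cache):
    """Parse pages in ascending page-ID order, checkpointing after each one."""
    start = load_checkpoint()
    for page_id in sorted(page_ids):
        if page_id <= start:
            continue  # already done in a previous run
        parse_and_cache(page_id)
        save_checkpoint(page_id)
```

The key property is that progress is persisted after every page, so re-running the script after an abort picks up where the previous run stopped instead of starting over.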
== Options/Flags ==
* `--all` - parse all pages on the wiki (pages that already have a ParserCache entry are skipped by default; combine with `--force` to re-parse them).
* `--force` - force a parse even if there is already a ParserCache entry.
* `--namespace X` - parse only pages in the given namespace. Example: `--namespace MediaWiki:`
* `--start-from X` - the page ID to start parsing from.
* More TBA.
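The intended interaction of these flags might look like the following (again an illustrative Python sketch, not the PHP implementation; `should_parse`, the `opts` dict, and the page dict shape are hypothetical):

```python
def should_parse(page, opts, has_cache_entry):
    """Decide whether a page needs a Parsoid parse under the proposed flags.

    page:  dict with "id" (page ID) and "namespace"
    opts:  parsed command-line options ("namespace", "start_from", "force")
    has_cache_entry: whether the ParserCache already holds output for this page
    """
    # --namespace: restrict the run to the given namespace.
    if opts.get("namespace") is not None and page["namespace"] != opts["namespace"]:
        return False
    # --start-from: skip pages below the starting page ID.
    if page["id"] < opts.get("start_from", 0):
        return False
    # Default: skip pages that already have a ParserCache entry,
    # unless --force is given.
    if has_cache_entry and not opts.get("force", False):
        return False
    return True
```

This keeps the default run cheap and idempotent (cached pages are skipped), while `--force` turns it into a full re-parse.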
[1] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/806443