In the past, when we undeployed extensions or services that stored large amounts of content, we created a static HTML dump of that content first. For example:
- {T336320}
- {T243056}
- static-bugzilla.wikimedia.org
Despite my best efforts (okay, maybe I could have done better, but still ...), it appears that exporting content from Flow to wikitext is inherently lossy: it may fail to properly transfer all of the content on a Flow board because of Parsoid HTML edge cases, wikitext formatting issues, etc. (e.g. T388687, https://www.mediawiki.org/wiki/User_talk:Pppery#Flow_cleanup_bot_(posting_here_to_avoid_accidentally_cluttering_the_thread_at_the_Village_Pump), and probably others).
Technically, no content would be lost, because Flow has good dump coverage, but still ...
Additionally, there's no way to preserve links to individual topics from third-party websites without doing this (or telling the Wayback Machine to do it, as I originally suggested in T332022#10639702 before realizing this would be better).
Thus, we should consider doing the same thing we already did for Bugzilla, CodeReview, RT, etc.: screen-scrape all hundred-thousand-ish Flow topics, stash the HTML somewhere, and set up redirects (see the sketch below).
This would be in addition to the wikitext exports being done in Phase 2 of the undeployment, to catch any edge cases they miss, not instead of them.
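The scraping step itself would be fairly mechanical: fetch the rendered HTML of each Topic: page and write it to disk. A minimal sketch, assuming Python with `requests`, a per-wiki base URL, and a flat output directory (the topic ID in the example is made up); enumerating the real topic IDs and the redirect setup are out of scope here.

```python
#!/usr/bin/env python3
"""Rough sketch of the scraping step. The URL pattern, output layout, and
topic ID below are assumptions for illustration, not a finished design."""

import pathlib

import requests

WIKI_BASE = "https://www.mediawiki.org/wiki/"   # per-wiki base URL (assumed)
OUTPUT_DIR = pathlib.Path("flow-html-dump")     # where the static HTML lands (assumed)


def archive_topic(topic_id: str) -> None:
    """Fetch the rendered HTML of one Flow topic page and write it to disk."""
    url = WIKI_BASE + "Topic:" + topic_id
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    out_path = OUTPUT_DIR / f"{topic_id}.html"
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_text(resp.text, encoding="utf-8")


if __name__ == "__main__":
    # The real job would iterate over every topic ID taken from the Flow
    # tables or dumps; one hard-coded (hypothetical) ID just shows the shape.
    archive_topic("Sxtu3vmyerqc3wsd")
```

A production run would obviously also need rate limiting, retries, and some care about which revision of each topic gets captured, but none of that changes the basic shape.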