- setting up a script on francium to dump a single wiki to a specified directory (ns0 current revision only, but these should be configurable)
- setting up a script, also on francium, to loop through all wikis and dump them, doing cleanup of old files, generating pages with links for downloaders, etc.
- adding a cron job to automate the run
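The wrapper script's loop could be sketched roughly as below. This is only a sketch: the paths, the wiki list, and the dump step itself are hypothetical placeholders (in production the list would come from the dblist files, and the dump step would actually fetch rendered HTML), but it shows the intended shape of the per-wiki loop, old-file cleanup, and downloader index generation.

```shell
#!/bin/bash
# Hypothetical sketch of the all-wikis wrapper; paths and names are
# placeholders, not the actual francium layout.
# Example cron entry (schedule is illustrative only):
#   0 3 * * 1  dumpsgen  /usr/local/bin/loop-html-dumps.sh
set -eu

DUMP_ROOT="${DUMP_ROOT:-/tmp/htmldumps}"
WIKIS="enwiki frwiki dewiki"   # in production: read from all.dblist
KEEP=3                         # number of old runs to keep per wiki

DATE=$(date +%Y%m%d)
for wiki in $WIKIS; do
    outdir="$DUMP_ROOT/$wiki/$DATE"
    mkdir -p "$outdir"
    # Placeholder for the single-wiki dump step (ns0, current revisions
    # only); the real script would write the rendered HTML archive here.
    : > "$outdir/$wiki-$DATE-ns0-current.html.tar.gz"

    # Cleanup: drop all but the newest $KEEP run directories for this wiki.
    ls -1d "$DUMP_ROOT/$wiki"/*/ | sort | head -n -"$KEEP" | xargs -r rm -rf
done

# Generate a simple index page with links for downloaders.
{
    echo "<html><body><ul>"
    for f in "$DUMP_ROOT"/*/*/*; do
        echo "<li><a href=\"$f\">$(basename "$f")</a></li>"
    done
    echo "</ul></body></html>"
} > "$DUMP_ROOT/index.html"
```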
This task does not cover dumping namespaces other than the main namespace (ns0), nor dealing with revision history. If/when that is desired, it should be a new task.
- T93396 Decide on format options for HTML and possibly other dumps
- T93113 Deploy francium for HTML/ZIM dumps
- T97125 Determine service infra for HTML dumps
- T17017 Wikimedia static HTML dumps broken (this task started out about the HTML dump extension, but discussion veered off to dumps from RESTBase)