MediaWiki enforces an upper limit on page size, and archive pages can grow past it. The limit is exposed in the siteinfo API as 'maxarticlesize':
https://www.mediawiki.org/wiki/Manual:$wgMaxArticleSize
https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&meta=siteinfo&siprop=general%7Cnamespaces%7Cnamespacealiases%7Cstatistics
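A minimal sketch of reading that value from a siteinfo response (the nested `query.general.maxarticlesize` path matches what `action=query&meta=siteinfo&siprop=general` returns; the sample payload below is illustrative, and 2097152 bytes is just the common default from $wgMaxArticleSize):

```python
def get_max_article_size(siteinfo_response: dict) -> int:
    """Extract the page size limit in bytes from a siteinfo API response."""
    return siteinfo_response["query"]["general"]["maxarticlesize"]

# Illustrative response shape; a real call would be something like:
# requests.get(api_url, params={"action": "query", "meta": "siteinfo",
#                               "siprop": "general", "format": "json"}).json()
sample = {"query": {"general": {"maxarticlesize": 2097152}}}
print(get_max_article_size(sample))
```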
When an archive page reaches maxarticlesize, the bot should stop trying to add content to it and find somewhere else to archive to.
For size-based archiving, this is straightforward: just cap the user-configured max-size parameter at maxarticlesize.
For time-based archiving, it is not so simple. One possibility is to create a new archive page by adding a suffix (e.g. 2020 → 2020_(2)). That complicates the implementation somewhat, and it will clutter the index page of archives. There might be a better way to handle this.
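For what it's worth, the suffix scheme itself is easy to generate; here is a sketch (the function and the exact `_(n)` naming are assumptions based on the 2020 → 2020_(2) example above):

```python
import re

def next_archive_title(title: str) -> str:
    """Return the next title in the 2020 -> 2020_(2) -> 2020_(3) scheme.

    A title without a _(n) suffix gets _(2); an existing suffix
    is incremented.
    """
    m = re.fullmatch(r"(.*)_\((\d+)\)", title)
    if m:
        return f"{m.group(1)}_({int(m.group(2)) + 1})"
    return f"{title}_(2)"

print(next_archive_title("User_talk:Example/2020"))
print(next_archive_title("User_talk:Example/2020_(2)"))
```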
Any thoughts?
Original discussion: https://commons.wikimedia.org/w/index.php?title=User_talk:ArchiverBot&oldid=436612913#Not_archiving_a_page