* What tools/bots already exist to address linkrot?
** [[ https://en.wikipedia.org/wiki/User:Cyberbot_II | Cyberbot II ]] on EN
** [[ https://es.wikipedia.org/wiki/User:Elvisor | Elvisor ]] on eswiki
** The French Wikipedia also has a bot that archives all external links; we should look into how it does this.
* What archiving services could be used and what features/APIs do they offer?
** The Wayback Machine offers an API for retrieving archived snapshots of a URL: https://archive.org/help/wayback_api.php
* Existing contacts between the WMF and Internet Archive that we could pursue?
Alex (@Sadads) and others have been working with the Internet Archive, which is very interested in helping.
Alex says: " We have made significant progress with [[ https://en.wikipedia.org/wiki/User:Cyberbot_II | en:User:Cyberbot II ]] adding links to archiveurls, but there needs to be a good technical way to store. Talked with Jdforrester (WMF) about building it into citoid at WikiCon USA. Internet archive was there, and expressed an interest in pushing their API's to the limit, to fix the 404 and other errors on Wikipedia."
Also investigate how an ecosystem of link-fixing tools could work, with multiple complementary approaches.
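As a rough illustration of how a link-fixing bot could use the Wayback API linked above: the Availability endpoint (`https://archive.org/wayback/available`) takes a `url` and an optional `timestamp` (YYYYMMDDhhmmss) and returns JSON describing the closest archived snapshot. The sketch below builds the query URL and extracts the snapshot from a decoded response; the function names and overall structure are illustrative, not from any existing bot.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://archive.org/wayback/available"

def availability_url(target, timestamp=None):
    """Build a Wayback Availability API query URL.

    timestamp is an optional YYYYMMDDhhmmss string; when given, the
    API returns the snapshot closest to that time.
    """
    params = {"url": target}
    if timestamp:
        params["timestamp"] = timestamp
    return API + "?" + urlencode(params)

def closest_snapshot(payload):
    """Extract (archive_url, timestamp) from a decoded API response,
    or None if no archived copy is available."""
    snap = payload.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"], snap["timestamp"]
    return None

# A bot would issue one such request per dead external link, e.g.:
# with urlopen(availability_url("example.com", "20060101")) as resp:
#     archive = closest_snapshot(json.load(resp))
```

If `closest_snapshot` returns None, the link has no archived copy and would need a different remedy (e.g. tagging the citation as a dead link).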