Just like Wikipedia, Wikidata faces the problem of link rot (https://en.wikipedia.org/wiki/Link_rot). Many of the URLs we link to, for example as references, will stop working or have already stopped working. The first step is to back up the contents of these linked pages.
Let's automatically save the external web pages linked from Wikidata items in the Internet Archive, in order to prevent data loss.
The mechanism could consist of invoking https://web.archive.org/save/<URL> internally whenever a new value is set for a property with an external-link data type (URL or external identifier).
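As a rough illustration, the save call could look like the sketch below. This is only a minimal outline using the public Save Page Now endpoint; the function and module names, the User-Agent string, and the error handling are assumptions, not part of any existing bot.

```python
import urllib.parse
import urllib.request

SAVE_ENDPOINT = "https://web.archive.org/save/"

def save_page_url(url: str) -> str:
    """Build the Wayback Machine 'Save Page Now' URL for an external link."""
    return SAVE_ENDPOINT + url

def archive(url: str, timeout: float = 30.0) -> int:
    """Ask the Wayback Machine to snapshot `url`; returns the HTTP status code.

    Note: a production bot would need rate limiting, retries, and
    authentication against the Save Page Now API; this sketch omits all of that.
    """
    request = urllib.request.Request(
        save_page_url(url),
        headers={"User-Agent": "wikidata-archive-sketch/0.1"},  # hypothetical UA
    )
    with urllib.request.urlopen(request, timeout=timeout) as response:
        return response.status
```

In practice the bot would call `archive(...)` once per newly added URL value, ideally queued and throttled so it stays within the Internet Archive's request limits.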
One implementation path would be to update or fork InternetArchiveBot (PHP source at https://github.com/cyberpower678/Cyberbot_II/tree/master/IABot); another would be to write a minimal bot from scratch to bridge the gap until InternetArchiveBot supports Wikidata.
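A minimal bot would mainly need to pick the URL-valued statements out of an item's JSON. The helper below sketches that extraction step against the Wikibase entity JSON shape ("claims" keyed by property ID, each statement carrying a "mainsnak" with a "datatype"); the function name is an assumption for illustration.

```python
def external_urls(entity: dict) -> list:
    """Collect the values of URL-datatype statements from a Wikibase entity
    JSON document (as returned by wbgetentities or Special:EntityData).

    External identifiers are skipped here; resolving them to URLs would
    require each property's formatter URL (P1630), which this sketch omits.
    """
    urls = []
    for statements in entity.get("claims", {}).values():
        for statement in statements:
            snak = statement.get("mainsnak", {})
            # Only concrete values; "novalue"/"somevalue" snaks have no URL.
            if snak.get("datatype") == "url" and snak.get("snaktype") == "value":
                urls.append(snak["datavalue"]["value"])
    return urls
```

Feeding this from the recent-changes feed (or Wikimedia EventStreams) and passing each new URL to the archiving call would cover the core of the stopgap bot.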