Given the huge number of URLs pointing to a variety of websites, it would be a good idea to introduce a bot that can provide archive URLs. This may require a lot of work.
Description
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | | Cyberpower678 | T120433 Migrate dead external links to archives
Resolved | | None | T136130 Add IABot support for other wikis (tracking)
Resolved | | Cyberpower678 | T143488 Save contents of URLs linked from Wikidata in the Internet Archive
Resolved | | Cyberpower678 | T187611 Adapt InternetArchiveBot to Wikidata
Event Timeline
For information: this should be done per property, as some properties are not appropriate to archive (email address, archive URL, IRC, etc.); by contrast, external IDs may be worth archiving if the target page includes useful information.
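A minimal sketch of what such per-property filtering could look like, written in Python rather than IABot's actual codebase. The skip-list property IDs are assumptions for illustration only (the official website P856 and reference URL P854 entries are real properties); a real deployment would keep these lists configurable on-wiki.

```python
# Illustrative sketch, not IABot's actual logic: decide per Wikidata property
# whether a URL value should be submitted to the Internet Archive.

# Assumed property IDs whose values should not be archived
# (email addresses, archive URLs, IRC channels, ...).
SKIP_PROPERTIES = {
    "P968",   # assumed: email address
    "P1065",  # assumed: archive URL (value is already an archive)
}

# Properties whose values are ordinary URLs worth archiving.
ARCHIVE_PROPERTIES = {
    "P856",   # official website
    "P854",   # reference URL
}


def should_archive(property_id: str, is_external_id: bool) -> bool:
    """Return True if values of this property should be archived."""
    if property_id in SKIP_PROPERTIES:
        return False
    if property_id in ARCHIVE_PROPERTIES:
        return True
    # External IDs can be useful to archive when the formatted target page
    # carries meaningful content; treat that as the default for them.
    return is_external_id


if __name__ == "__main__":
    print(should_archive("P856", is_external_id=False))  # True
    print(should_archive("P968", is_external_id=False))  # False
```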
Well, this is certainly something different. IABot's code is not even compatible with Wikidata. This would take a lot of doing.