As a third-party reuser attempting to assess the quality of a new revision in context, I want to know whether the existing references/citations on an article are themselves of good quality. If a cited URL is broken or resolves to a 404 page, that is a negative quality signal for the page.
We want to determine whether we can detect, in real time, if an article has broken references.
To Do
- Request the cited webpage directly and record its HTTP status.
- Query the Internet Archive to see whether a snapshot exists when the live URL returns a 404.
- Check whether a PageRank score can still be obtained from Google (the public PageRank API has been retired, so this may not be feasible).
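The first two checks above can be sketched in a few lines. This is a minimal sketch, assuming Python: the function names are illustrative, and the only external endpoint used is the Internet Archive's public availability API (`https://archive.org/wayback/available`). The PageRank step is not sketched because Google no longer exposes a public PageRank API.

```python
import json
import urllib.error
import urllib.parse
import urllib.request

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None on a network failure."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 404, 500, etc. still carry a status code
    except (urllib.error.URLError, OSError):
        return None  # DNS failure, timeout, refused connection

def is_broken(status):
    """Treat 4xx/5xx responses, or no response at all, as a broken reference."""
    return status is None or status >= 400

def wayback_snapshot(url, timeout=10):
    """Ask the Internet Archive availability API for the closest snapshot of url."""
    api = "https://archive.org/wayback/available?url=" + urllib.parse.quote(url, safe="")
    with urllib.request.urlopen(api, timeout=timeout) as resp:
        data = json.load(resp)
    # None if the URL has never been archived
    return data.get("archived_snapshots", {}).get("closest")
```

A HEAD request keeps the cost of each check low, since we only need the status line, not the page body.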
Acceptance criteria
- Document the options and their feasibility, with particular attention to cost.
- Even if feasible, would it be worth doing given the cost and any added latency?
- Should Enterprise be storing this information?
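On the latency question: an article may cite dozens of URLs, so checking them serially would add the sum of all round-trip times. Running the checks concurrently bounds the added latency roughly by the slowest single check. A minimal sketch, assuming Python; `check_all` is a hypothetical helper, and the `check` argument would be a function like the per-URL status check above.

```python
from concurrent.futures import ThreadPoolExecutor

def check_all(urls, check, max_workers=20):
    """Run check(url) for every URL concurrently and return {url: result}.

    Wall-clock latency is bounded by the slowest single check (plus pool
    overhead), rather than by the sum of all checks.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(urls, pool.map(check, urls)))
```

The same structure would let us attach a hard per-check timeout, which is what ultimately caps any latency added to a real-time pipeline.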