From Sam Walton: The first is historical data. It would be incredibly helpful to be able to search for the number of external links at some date in the past. Presently, as I mentioned, we just record the value every couple of weeks. I guess a good way to do this would be to run the tool over old Wikipedia dumps to backfill data for the past year or two, and then set it up to run automatically every week or two, recording values for certain URLs. (There's probably a better way to do this!) The ability to display these values in a simple graph would also be very useful.
In part, this wouldn't have to come from dumps: it could run the same queries as now via Hay's tool, just updating a repository/record of the values on a regular basis.
In my mind, I am imagining something like https://tools.wmflabs.org/glamtools/baglama2/index.html#
The dumps part was so that when we start using it we already have historical data, not just data from the point we start. I agree that after the initial backfill it could just run and record on a regular basis.
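To make the idea above concrete, here is a minimal sketch of the "record on a regular basis" part: a small store of dated link counts per URL, which a cron job could append to every week or two. The counting function is a placeholder (the real source would be Hay's tool, the MediaWiki API, or a dump run, none of which is specified here), and the table layout and function names are my own assumptions, not anything from the discussion.

```python
import sqlite3
import datetime


def record_link_count(db_path, url, count_fn, when=None):
    """Store one observation of the external-link count for `url`.

    `count_fn(url)` is a stand-in for whatever actually counts the links
    (Hay's tool, the API, or a dump pass) -- an assumption for this sketch.
    """
    when = when or datetime.date.today().isoformat()
    conn = sqlite3.connect(db_path)
    # One row per (url, date); re-running the same day just overwrites.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS link_counts "
        "(url TEXT, observed TEXT, count INTEGER, PRIMARY KEY (url, observed))"
    )
    conn.execute(
        "INSERT OR REPLACE INTO link_counts VALUES (?, ?, ?)",
        (url, when, count_fn(url)),
    )
    conn.commit()
    conn.close()


def history(db_path, url):
    """Return (date, count) pairs in date order, ready for a simple graph."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT observed, count FROM link_counts "
        "WHERE url = ? ORDER BY observed",
        (url,),
    ).fetchall()
    conn.close()
    return rows
```

Backfilling from dumps would then just mean calling `record_link_count` with dates taken from each dump, so historical and ongoing observations end up in the same table that the graphing step reads from.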