(and the bot should probably honor robots.txt in most cases as well)
In November 2016, a weblinkchecker bot on Tool Labs requested many pages from http://www.minorplanetcenter.net/, at a rate of > 100 req/minute. This caused excessive load on the web server, and resulted in Tool Labs being blocked from accessing the site.
Weblinkchecker should reduce the rate of requests to any single site, preferably to something along the lines of one request every 10 s. There are a large number of links to check, but pages on different websites can reasonably be requested in parallel, so the overall slowdown shouldn't be too significant.
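A per-host throttle along these lines could be sketched as follows. This is a hypothetical illustration, not the actual weblinkchecker code: the `PerHostThrottle` class and its names are assumptions, and the 10-second default mirrors the delay suggested above. The idea is to track, per host, the earliest time the next request is allowed, so that checks against different hosts can still run in parallel.

```python
import threading
import time
from urllib.parse import urlparse

class PerHostThrottle:
    """Enforce a minimum delay between requests to the same host,
    while letting requests to different hosts proceed immediately.
    (Hypothetical sketch, not the actual weblinkchecker implementation.)"""

    def __init__(self, min_delay=10.0):
        self.min_delay = min_delay
        self._lock = threading.Lock()
        # host -> earliest monotonic timestamp at which the next request may go out
        self._next_allowed = {}

    def wait(self, url):
        """Block until a request to the host of `url` is permitted."""
        host = urlparse(url).netloc
        while True:
            with self._lock:
                now = time.monotonic()
                allowed_at = self._next_allowed.get(host, now)
                if now >= allowed_at:
                    # Reserve the next slot for this host and proceed.
                    self._next_allowed[host] = now + self.min_delay
                    return
                sleep_for = allowed_at - now
            # Sleep outside the lock so other hosts are not blocked.
            time.sleep(sleep_for)
```

Each worker thread would call `throttle.wait(url)` before fetching; two links on www.minorplanetcenter.net would then be spaced at least `min_delay` apart, while a link on another site goes out right away.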
Access log from external site: F4978348 (only visible to selected users)