Looking through the access logs of my local wiki, I noticed that Special:Whatlinkshere and Special:Recentchangeslinked (including all their subpages) were downloaded by spiders. These pages already carry the "noindex" attribute, so the spiders don't store the information, but the server still has to generate them.
I propose that these pages be added as Disallow entries in the robots.txt file, to reduce server load and bandwidth.
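For illustration, a sketch of the entries I have in mind. The exact paths are an assumption on my part and depend on the wiki's URL layout (the /wiki/ and /index.php?title= forms below are just the common MediaWiki setups, and localized special-page aliases would need their own lines):

    User-agent: *
    # short-URL form of the special pages and their subpages
    Disallow: /wiki/Special:Whatlinkshere
    Disallow: /wiki/Special:Recentchangeslinked
    # query-string form served by index.php
    Disallow: /index.php?title=Special:Whatlinkshere
    Disallow: /index.php?title=Special:Recentchangeslinked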
Version: unspecified
Severity: enhancement