It sometimes happens that a user is discussed unfavorably due to disruption, sockpuppetry, edit warring, or vandalism.
When they leave (or are banned), the pages on which they were discussed remain indexable by Google, which is a major dilemma: in extreme cases we have had to rewrite all of their signatures, and we often have to courtesy-blank the page purely to prevent it from being spidered.
In a number of cases these users then return (with or without permission), which leads to a further problem: administrators seeking to understand the history cannot easily do so.
Would a token NOSPIDER be possible? A page containing this token would render as an error page, or blank, if the viewer was a robot or spider (I don't know the best way to do this technically).
(Perhaps one easy way: if the viewer is an anonymous IP, serve a page that says "This page is blocked from spiders; if you are a human, please enter this CAPTCHA to view it." Most spiders/robots aren't logged in.)
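An even simpler variant than the CAPTCHA idea would be to leave the page readable but ask well-behaved search engines not to index it, via a standard robots meta tag in the rendered HTML. As a rough sketch only (the token name `__NOSPIDER__` and the function here are hypothetical, not an existing feature):

```python
# Hypothetical sketch: when rendering a page, check the wikitext for a
# NOSPIDER token and, if present, emit a robots meta tag so compliant
# search engines drop the page from their indexes.

NOSPIDER_TOKEN = "__NOSPIDER__"
NOINDEX_META = '<meta name="robots" content="noindex,nofollow">'

def extra_head_markup(wikitext: str) -> str:
    """Return extra <head> markup for a rendered page.

    Pages containing the (hypothetical) NOSPIDER token get a
    noindex/nofollow meta tag; ordinary pages get nothing extra.
    """
    if NOSPIDER_TOKEN in wikitext:
        return NOINDEX_META
    return ""

print(extra_head_markup("Old discussion of a user... __NOSPIDER__"))
print(extra_head_markup("An ordinary article."))
```

The token itself would be invisible in the rendered page, so applying it is as cheap as any other magic word, and the page text itself stays intact for administrators.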
This would be a useful tool to reduce the conflict between our need for edit histories and pages to remain useful to administrators in future, and the fair need of a user not to be googled that way for the rest of their life. Rather than having to wholesale edit swathes of the wiki, we could tag certain pages as NOSPIDER; they would rapidly drop off search engine caches (meeting the best interest of the party) and yet could more often be left intact (for us). It would also have the advantage of being invisible in the rendered page for most users and very easy to apply, so we could actually use it more widely whenever this problem comes along.