Currently, if [[MediaWiki:Robots.txt]] exists, it is served as this wiki's robots.txt. However, this approach prevents developers from making centralised changes to robots.txt, such as blocking newly identified spiders. I therefore propose building the file from two parts: a centralised part, containing the user-agent rules and the prohibition on indexing /w/, followed by the content of MediaWiki:Robots.txt, if that page exists. The centralised part could be editable by shell users, or mirrored from Meta, as the www portals currently are.
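For illustration, a minimal sketch of the proposed assembly in Python (MediaWiki itself is PHP, and fetch_wiki_page() is a hypothetical stand-in for a wiki page lookup, not an existing API):

# Sketch of the two-part robots.txt proposed above; names are illustrative.

# Centralised part: shared user-agent rules plus the /w/ indexing
# prohibition. In production this would be editable by shell users
# or mirrored from Meta.
CENTRAL_RULES = """\
User-agent: *
Disallow: /w/
"""

def fetch_wiki_page(title: str) -> str | None:
    """Hypothetical helper: return a wiki page's content, or None if absent."""
    # A real implementation would query the wiki's database or API.
    return None

def build_robots_txt() -> str:
    """Serve the centralised rules, followed by local additions if present."""
    parts = [CENTRAL_RULES.rstrip("\n")]
    local = fetch_wiki_page("MediaWiki:Robots.txt")
    if local:
        parts.append(local.rstrip("\n"))
    return "\n\n".join(parts) + "\n"

if __name__ == "__main__":
    print(build_robots_txt())

With this layout, a new spider can be blocked on every wiki by editing the centralised part alone, while each wiki's MediaWiki:Robots.txt remains purely additive.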
Version: unspecified
Severity: enhancement
URL: http://en.wikipedia.org/robots.txt