
Do not add robots.txt entries only relevant to one wiki to another wiki
Closed, Duplicate · Public

Description

Currently, the English Wikipedia has many entries at
https://en.wikipedia.org/robots.txt
that are only relevant to other wikis. For example:

# ptwiki:
# T7394
Disallow: /wiki/Wikipedia:Páginas_para_eliminar/
Disallow: /wiki/Wikipedia:P%C3%A1ginas_para_eliminar/
Disallow: /wiki/Wikipedia%3AP%C3%A1ginas_para_eliminar/
Disallow: /wiki/Wikipedia_Discussão:Páginas_para_eliminar/
Disallow: /wiki/Wikipedia_Discuss%C3%A3o:P%C3%A1ginas_para_eliminar/
Disallow: /wiki/Wikipedia_Discuss%C3%A3o%3AP%C3%A1ginas_para_eliminar/

This block refers to pages that only exist on
https://pt.wikipedia.org
so I don't see why these rules should be added to en.wikipedia.org (or to any other Wikipedia).

Maybe I missed something about how these files work?
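
For what it's worth, the effect is observable with Python's standard-library urllib.robotparser. This is just an illustrative sketch; the page URL below is hypothetical, built from one of the disallowed prefixes quoted above:

from urllib.robotparser import RobotFileParser

# Parse the live robots.txt served by English Wikipedia.
rp = RobotFileParser("https://en.wikipedia.org/robots.txt")
rp.read()

# A hypothetical subpage under the ptwiki-only prefix. The page does not
# exist on en.wikipedia.org, yet as long as the Disallow line is in the
# file, compliant crawlers are told not to fetch this path on this host.
url = "https://en.wikipedia.org/wiki/Wikipedia:P%C3%A1ginas_para_eliminar/Foo"
print(rp.can_fetch("*", url))  # prints False while the rule is present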

Event Timeline

He7d3r created this task. · Dec 29 2017, 5:27 PM
Restricted Application added a subscriber: Aklapper. · Dec 29 2017, 5:27 PM

As far as I know, robots.txt is the same for all projects run by the Wikimedia Foundation and it's impossible to change robots.txt per project.
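
One way to check that claim is to fetch and diff the files directly. A minimal sketch in Python, assuming nothing beyond plain HTTPS access to each wiki's /robots.txt:

import urllib.request

def fetch_robots(host):
    # Download a wiki's robots.txt as text.
    with urllib.request.urlopen(f"https://{host}/robots.txt") as resp:
        return resp.read().decode("utf-8")

en = fetch_robots("en.wikipedia.org")
pt = fetch_robots("pt.wikipedia.org")

# If robots.txt really were identical across projects, this would print
# "identical"; any per-wiki difference prints "differs".
print("identical" if en == pt else "differs")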