SPARQL endpoint should have robots.txt
Closed, Resolved · Public

Description The SPARQL endpoint should have a robots.txt that prevents robots from crawling queries when somebody puts a direct link to a query on their page.
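A catch-all robots.txt along the lines sketched below would achieve this; note that the file body and the example query URL here are assumptions for illustration, not the exact contents of the eventual patch. The snippet uses Python's `urllib.robotparser` to check that a compliant crawler would refuse to fetch a linked query:

```python
# Sketch: a robots.txt that disallows all crawling, so that direct
# links to queries are not fetched by robots. The file body and the
# example URL are assumptions, not taken from the actual patch.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A direct link to a query is now off-limits to compliant crawlers.
print(rp.can_fetch("Googlebot", "https://query.example.org/sparql?query=..."))
# → False
```

`Disallow: /` is the simplest rule that covers every query URL; a narrower `Disallow:` prefix could be used instead if some paths on the endpoint should remain crawlable.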

Event Timeline

Smalyshev raised the priority of this task from to High.
Smalyshev updated the task description. (Show Details)
Smalyshev added a subscriber: Smalyshev.
Restricted Application added a subscriber: Aklapper.

Change 270666 had a related patch set uploaded (by Smalyshev):
Add robots.txt

Smalyshev claimed this task.