Quarry should have a robots.txt to control well-behaved bots. At the very least the login route should be blocked, but there are likely more routes worth excluding (e.g. forking?).
Description
Event Timeline
The utility of a public web search returning cached SQL query results from Quarry seems pretty low. An argument could be made that public web search makes it easier to find an existing query to fork, but the app already has an internal search for that use case. I would be tempted to set a `Disallow: /` policy.
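A deny-all policy like the one suggested is a small static file. As a sketch (assuming it is served at the site root as `/robots.txt`), it might look like:

```
# Deny-all robots.txt: ask all well-behaved crawlers
# to skip every route on the site.
User-agent: *
Disallow: /
```

Note that this only affects crawlers that honor the Robots Exclusion Protocol; it is not an access control mechanism.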
Mentioned in SAL (#wikimedia-cloud) [2025-06-23T10:49:20Z] <taavi> deploy deny-all robots.txt file T397502