
Update robots.txt file to disallow POST routes
Closed, ResolvedPublic1 Estimated Story Points

Description

Our logfile analysis shows an uptick in failed GET requests to routes that only accept HTTP POST (adding donations, validating data). The failed attempts appear to come from crawlers. To signal to well-behaved crawlers that they should leave those URLs alone, we should update our robots.txt file.
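
A minimal sketch of what the updated robots.txt could look like. The route paths below are placeholders standing in for the actual POST-only donation and validation endpoints, which would need to be filled in from the application's routing table:

```
# Hypothetical example; replace the paths with the real POST-only routes
User-agent: *
Disallow: /add-donation
Disallow: /validate-donation
```

Note that Disallow rules are prefix matches, so a single rule covers all sub-paths of a route, and robots.txt is advisory only: it keeps well-behaved crawlers away but does not block misbehaving clients.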