The ability to search commit history (and ideally also code review comments) is greatly appreciated by devs. As noted in [[Wikimedia technical search]] and elsewhere, we're currently forced to rely on external mirrors (like gmane.org) because git.wikimedia.org has never been robust enough.
I see that some Diffusion URLs are currently indexed by search engines, e.g. https://phabricator.wikimedia.org/rMW23fab68274456f796563a5eac2ab70cb307afe1a , perhaps because they were linked from a task. However, robots.txt disallows crawling of /diffusion/, which makes the indexing spotty and inconsistent. Can we try removing that rule, or find some other way to ensure the main Diffusion URLs are consistently indexed?
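For context, the relevant rule presumably looks something like the excerpt below (this is a sketch; the actual contents of https://phabricator.wikimedia.org/robots.txt may differ):

```
# Hypothetical excerpt of Phabricator's robots.txt
User-agent: *
Disallow: /diffusion/
```

Dropping the `Disallow: /diffusion/` line (or narrowing it to only the expensive sub-paths, such as raw file or blame views, while leaving commit pages crawlable) would let search engines index commit pages consistently instead of only the ones they happen to discover via links from tasks.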