The global robots.txt loaded from '/srv/mediawiki/robots.txt' by robots.php seems to contain an empty line on row 423:
420 Disallow: /wiki/Wikiquote%3AVotes_for_deletion_archive/
421 Disallow: /wiki/Wikiquote_talk:Votes_for_deletion_archive/
422 Disallow: /wiki/Wikiquote_talk%3AVotes_for_deletion_archive/
423 
424 # enwikibooks
425 Disallow: /wiki/Wikibooks:Votes_for_deletion
426 #
This violates the Robots Exclusion Standard (http://www.robotstxt.org/orig.html), which defines the empty line as a separator between records. A bot that adheres strictly to the specification may ignore every directive below that line, because those directives lack the User-agent line with which each record must start. In effect, this could disable every project's custom MediaWiki:Robots.txt, since their contents are appended to the end of the global robots.txt file and are supposed to belong to the single large 'User-agent: *' record that starts on line 147 of the global robots.txt.
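To illustrate why a strict parser would drop those directives, here is a minimal sketch (hypothetical; real crawlers vary in leniency) of a spec-compliant record splitter: it breaks the file into records on blank lines and discards any record that does not begin with a User-agent line.

```python
def parse_records(text):
    """Split robots.txt into records on blank lines; keep only
    records that begin with a User-agent line, per the original
    Robots Exclusion Standard."""
    records, current = [], []
    for line in text.splitlines():
        if line.strip() == "":          # blank line: record separator
            if current:
                records.append(current)
            current = []
        else:
            current.append(line)
    if current:
        records.append(current)
    # A valid record must start with a User-agent line.
    return [r for r in records if r[0].lower().startswith("user-agent:")]

# Simplified excerpt mirroring the problem in the global robots.txt:
sample = (
    "User-agent: *\n"
    "Disallow: /wiki/Wikiquote_talk%3AVotes_for_deletion_archive/\n"
    "\n"                                # the stray empty line
    "# enwikibooks\n"
    "Disallow: /wiki/Wikibooks:Votes_for_deletion\n"
)

records = parse_records(sample)
# Only the first record survives; everything after the blank line
# has no User-agent line and is discarded by a strict parser.
```

Under these rules the second, User-agent-less record (and with it the appended per-project directives) is silently dropped, which is exactly the failure mode described above.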
For example, see the robots.txt files of enwiki, bgwiki and meta (any other project is almost certainly the same):
https://en.wikipedia.org/robots.txt
https://bg.wikipedia.org/robots.txt
https://meta.wikimedia.org/robots.txt
While many bots may in fact tolerate this invalid empty line, it's still best to adhere to the specification as closely as possible.