When dumping entities to RDF, we try to load the data in batches so that we do not call the database for each entity separately. However, this does not work for MediaInfo, since MediaInfoHandler has this:
public function getTitleForId( EntityId $id ) {
	return Title::newFromID( $id->getNumericId() );
}
This performs a database fetch for every entity ID. The backtrace leading to it is:
frame #0: Title::newFromID(id=13) at /var/www/w/includes/Title.php:471
frame #1: Wikibase\MediaInfo\Content\MediaInfoHandler->getTitleForId(id=Wikibase\MediaInfo\DataModel\MediaInfoId) at /var/www/w/extensions/WikibaseMediaInfo/src/Content/MediaInfoHandler.php:187
frame #2: Wikibase\Repo\Content\EntityContentFactory->getTitleForId(id=Wikibase\MediaInfo\DataModel\MediaInfoId) at /var/www/w/extensions/Wikibase/repo/includes/Content/EntityContentFactory.php:149
  => array_map(callback=array(2), array(9)) (internal function)
frame #3: Wikibase\Dumpers\RdfDumpGenerator->preBatchDump(entities=array(9)) at /var/www/w/extensions/Wikibase/repo/includes/Dumpers/RdfDumpGenerator.php:118
frame #4: Wikibase\Dumpers\DumpGenerator->dumpEntities(entityIds=array(9), dumpCount=unknown type: 10) at /var/www/w/extensions/Wikibase/repo/includes/Dumpers/DumpGenerator.php:295
frame #5: Wikibase\Dumpers\DumpGenerator->generateDump(idPager=Wikibase\Repo\Store\Sql\SqlEntityIdPager) at /var/www/w/extensions/Wikibase/repo/includes/Dumpers/DumpGenerator.php:262
We probably need some kind of batching strategy to deal with this (see the sketch below). MediaInfo is different from other entities here: other entity types can create a Title from an EntityId without DB access. Ironically, the code that uses these Titles - PageProps::getProperties - actually needs page IDs, so it immediately converts the Titles back to IDs. Moreover, I'm not even sure the page properties we dump exist at all for NS_FILE+MediaInfo entities. So we're wasting a lot of DB queries for nothing.
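One possible direction is a batched counterpart to getTitleForId() built on Title::newFromIDs(), which loads all the page rows in a single query. A minimal sketch, assuming a new getTitlesForIds() method on MediaInfoHandler (the method name and the re-keying by entity ID serialization are my assumptions, not an existing interface):

public function getTitlesForIds( array $ids ) {
	// Map entity ID serialization => numeric page ID, so the
	// result can be re-keyed after the batch lookup.
	$pageIds = [];
	foreach ( $ids as $id ) {
		$pageIds[$id->getSerialization()] = $id->getNumericId();
	}

	// One database query for the whole batch instead of one
	// Title::newFromID() call per entity.
	$titles = Title::newFromIDs( $pageIds );

	// Re-key the Titles by entity ID serialization.
	$titlesById = [];
	foreach ( $titles as $title ) {
		$serialization = array_search( $title->getArticleID(), $pageIds, true );
		if ( $serialization !== false ) {
			$titlesById[$serialization] = $title;
		}
	}
	return $titlesById;
}

And even with batching in place, if the page properties we dump really never exist for MediaInfo entities, preBatchDump() could simply skip the Title lookup for this entity type altogether.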