When I was looking at this code a couple of weeks ago, I noticed that running queries in a for-loop looked a little suboptimal: every Comment object constructed in the loop immediately fires off two queries of its own. Anyway, I decided to play around with some premature optimization, and I came up with this:
https://github.com/miraheze/mediawiki-extensions-Comments/compare/REL1_27...master
Sorry, I've been too busy with work to learn how Gerrit works -- I gave up after about 90 minutes of investigating. You'll have to make do with GitHub's UI, or maybe a call to git remote.
Anyway, what the branch does is collapse all of the queries getComments needs -- a lot of small ones, 2*Ncomments + 1 in total -- into a single query that pulls everything from Comments LEFT JOIN Comments_Vote LEFT JOIN Comments_Vote (the vote table joined twice). MySQL's EXPLAIN SELECT output suggests it's reasonably fast.
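To make the before/after concrete, here's a minimal, self-contained sketch of the pattern using SQLite from Python. The column names (Comment_Vote_ID, Comment_Vote_user_id, Comment_Vote_Score, and so on) are illustrative guesses, not the extension's real schema -- the point is just the shape of the double LEFT JOIN: one join aggregated for the total score, one filtered to the current user's own vote.

```python
import sqlite3

# In-memory stand-in for the real tables; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Comments (
    Comment_ID INTEGER PRIMARY KEY,
    Comment_Text TEXT
);
CREATE TABLE Comments_Vote (
    Comment_Vote_ID INTEGER,      -- comment being voted on
    Comment_Vote_user_id INTEGER, -- who voted
    Comment_Vote_Score INTEGER    -- +1 or -1
);
INSERT INTO Comments VALUES (1, 'first'), (2, 'second');
INSERT INTO Comments_Vote VALUES (1, 10, 1), (1, 11, 1), (2, 10, -1);
""")

CURRENT_USER = 10

# Before: one query per comment per piece of vote data (2*N + 1 total).
# After: a single pass joining the vote table twice. This relies on the
# assumption that a user casts at most one vote per comment, so the
# second join never multiplies the rows being summed.
rows = conn.execute("""
    SELECT c.Comment_ID,
           c.Comment_Text,
           COALESCE(SUM(v.Comment_Vote_Score), 0) AS total_score,
           uv.Comment_Vote_Score                  AS my_vote
    FROM Comments c
    LEFT JOIN Comments_Vote v
           ON v.Comment_Vote_ID = c.Comment_ID
    LEFT JOIN Comments_Vote uv
           ON uv.Comment_Vote_ID = c.Comment_ID
          AND uv.Comment_Vote_user_id = ?
    GROUP BY c.Comment_ID, c.Comment_Text, uv.Comment_Vote_Score
    ORDER BY c.Comment_ID
""", (CURRENT_USER,)).fetchall()

for row in rows:
    print(row)  # (id, text, total_score, current user's vote or None)
```

With the sample data above this prints (1, 'first', 2, 1) and (2, 'second', -1, -1): comment 1 has two upvotes including user 10's, comment 2 has user 10's lone downvote.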
But back on the subject of premature optimization -- I'm not sure it actually is faster. We don't really have the data to test with, since Miraheze only just installed the extension on our wiki farm, and depending on server configuration the many small queries might even win. Either way, I'm running my branch in production with no complaints so far. I haven't tested it alongside other social extensions, either.
So, uh, here's some code you can use, if you want. Have fun!