Looking at http://people.wikimedia.org/~ori/enwiki-waterfall.png (from http://www.webpagetest.org/result/150316_0C_7MB/1/details/), we can see a login.wikimedia.org request that happens late and takes 270ms, about 200ms of which appears to be the DNS lookup.
Modern browsers do DNS lookups in parallel while fetching the page content. They do this automatically for resources referenced from the content (e.g. it happens for bits & upload). However, login.wikimedia.org isn't referenced anywhere in the page itself; checkLoggedIn is called from JavaScript, which means this DNS lookup is serialized behind fetching and evaluating that CentralAuth script.
This could easily be parallelized, shaving off most of that 200ms, by adding this to the head:
<link rel="dns-prefetch" href="//login.wikimedia.org">
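(dns-prefetch only resolves the hostname ahead of time; it doesn't open a connection, so the request itself is unchanged, but the ~200ms lookup moves off the critical path.)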
@ori points out that meta.wikimedia.org could similarly be prefetched, as it's used when running campaigns but isn't referenced anywhere in the original HTML.
There are other possible contenders we should think of. Note that while major browsers perform DNS prefetching of link targets by default, it's disabled for privacy reasons when the page is viewed over HTTPS, so we should explicitly prefetch cross-site hostnames as well, within reason (i.e. not all Wikipedia language variants ;)).
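To make this concrete, here's a minimal sketch of what the head additions could look like (the exact hostname list is an assumption to be refined; x-dns-prefetch-control is the standard opt-in for re-enabling automatic DNS prefetching of link targets on HTTPS pages, in case we'd rather do that than enumerate hostnames):

<!-- Explicit hints are honored even on HTTPS pages: -->
<link rel="dns-prefetch" href="//login.wikimedia.org">
<link rel="dns-prefetch" href="//meta.wikimedia.org">
<!-- Alternatively, opt back in to automatic prefetching of link targets over HTTPS: -->
<meta http-equiv="x-dns-prefetch-control" content="on">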