The Commons main page has been periodically unavailable this morning due to the poolqueue for this page filling up.
- http://commons.wikimedia.org/wiki/Main_Page isn't parser-cacheable. With debug logging enabled, "Parser output marked as uncacheable" is logged; that message comes from Parser::disableCache, which appears to be called from only one place, guarded by ( $title->getNamespace() == NS_SPECIAL && $this->mOptions->getAllowSpecialInclusion() && $this->ot['html'] ). I don't see anything Main_Page-related in the Special namespace, so I'm not sure what's going on there. The uncacheable flag also results in "don't cache" headers being sent to squid.
- The poolcounter makes a lot of sense for hot or rapidly changing pages that can be parser-cached: one apache gets the lock, all others queue up behind it, or, once 50 are queued, get an immediate error. For a popular page that can't be parser-cached it really sucks: all requests are serialized and stack up, resulting in very slow page load times or immediate errors.
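Roughly, the queuing behavior described above can be sketched as follows. This is an illustrative model only, not MediaWiki's actual PoolCounter code; the class and method names are made up, and the queue limit of 50 is taken from the observed behavior:

```python
class PoolCounterSketch:
    """Illustrative model: one worker holds the lock and parses;
    others queue behind it; past the limit, requests fail fast."""

    def __init__(self, max_queue=50):
        self.max_queue = max_queue
        self.holder = None   # worker currently parsing
        self.queue = []      # workers waiting for the result

    def request(self, worker):
        if self.holder is None:
            self.holder = worker
            return "LOCKED"      # this worker parses the page
        if len(self.queue) >= self.max_queue:
            return "QUEUE_FULL"  # immediate error to the client
        self.queue.append(worker)
        return "QUEUED"          # wait; for a cacheable page, reuse the result

    def release(self):
        # For a parser-cacheable page, queued workers read the fresh cache
        # entry and return quickly. For an uncacheable page, each queued
        # worker must take the lock and re-parse in turn, serializing all
        # requests -- which is the failure mode described above.
        self.holder = self.queue.pop(0) if self.queue else None
```

The key point the sketch makes concrete: with an uncacheable page, `release()` hands the lock to the next waiter instead of letting everyone reuse a cached parse, so latency grows linearly with queue depth.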
- Pages like this are insanely easy to DoS, either deliberately with minimal effort or just through natural traffic spikes.
- Main Pages should probably all be parser-cacheable, and/or we should disable the poolqueue for pages that aren't. Currently, however, cacheability doesn't seem to be determined until after parsing.