Special:Allpages would be a great page to let search engines crawl,
for smaller sites.
Allow me to make the case that one should be able to make
Special:Allpages spiderable. Currently it is _hardwired_ to
noindex,nofollow, just like the other Special pages.
$wgNamespaceRobotPolicies won't help, as the policy is hardwired in
SpecialSpecialpages.php; and even if $wgNamespaceRobotPolicies could be
used, one would want to limit the granularity to just Special:Allpages
and keep the rest of the Special: namespace at noindex,nofollow.
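
To illustrate the granularity being asked for, here is a hedged
LocalSettings.php sketch. It assumes the hardwired Special-page policy
were removed so that these settings could actually take effect; as
things stand today, neither line works for Special pages.

```php
<?php
// LocalSettings.php sketch -- hypothetical, since the hardwired
// noindex,nofollow on Special pages currently overrides both settings.

// Too coarse: this would open the *entire* Special: namespace
// to indexing (NS_SPECIAL is the Special: namespace constant).
// $wgNamespaceRobotPolicies[NS_SPECIAL] = 'index,follow';

// The desired granularity: index only Special:Allpages, leaving
// every other Special page at the default noindex,nofollow.
$wgArticleRobotPolicies['Special:Allpages'] = 'index,follow';
```

$wgArticleRobotPolicies is the existing per-page counterpart of
$wgNamespaceRobotPolicies; the point of this report is that no such
knob can reach Special pages at present.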
On the Main page, the first link I make is to Special:Allpages,
expecting users and search engines alike to use it.
Sure, other wikis might have a vibrant tree of information. However,
http://radioscanningtw.jidanni.org/ is more of a flat list, with many
categories that don't need pages just to say they represent, e.g.,
486.3785 MHz. I like my structure, and users can see all the content,
but search engines can't! Anyway, Special:Allpages would have been the
perfect way to get it indexed, were it not for the assumption that all
Special pages should be noindex,nofollow. No, I do not wish to maintain
my own private version of SpecialAllpages.php; I'm just offering an
observation.