
spider repellent for boring pages
Closed, Invalid · Public


On a small wiki, analyzing the web logs shows that 90% of the traffic
is search engines indexing the most useless places: Allmessages,
old revisions, you name it. They've crawled every nook and cranny.

Therefore, please apply nofollow/noindex more aggressively. Indeed, perhaps
exclude everything except current pages and categories, and keep crawlers out
of the Special: and MediaWiki: namespaces too, etc.

I see there is $wgNamespaceRobotPolicies etc., but small-wiki sysops
would not dare to tinker with these, so there should be better defaults.

Version: 1.7.x
Severity: minor
OS: Linux
Platform: PC



Event Timeline

bzimport raised the priority of this task to Low. Nov 21 2014, 9:29 PM
bzimport added a project: MediaWiki-Parser.
bzimport set Reference to bz8338.
bzimport added a subscriber: Unknown Object (MLST).
Jidanni created this task. Dec 20 2006, 5:23 PM

robchur wrote:

Read includes/DefaultSettings.php:

```php
/**
 * Robot policies for namespaces
 * e.g. $wgNamespaceRobotPolicies = array( NS_TALK => 'noindex' );
 */
$wgNamespaceRobotPolicies = array();
```

Doesn't seem all that hard to me.
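For a sysop who does want to tinker, a minimal sketch of what this could look like in LocalSettings.php (the namespace choices here are illustrative, not recommended defaults):

```php
<?php
// LocalSettings.php — a hedged sketch, not official defaults.
// Keep search engines out of talk namespaces and project discussion.
$wgNamespaceRobotPolicies = array(
    NS_TALK         => 'noindex,nofollow',
    NS_USER_TALK    => 'noindex,nofollow',
    NS_PROJECT_TALK => 'noindex,nofollow',
);
```

Old revisions and Special: pages are a separate question from this per-namespace setting, which is presumably what the follow-up comment below is asking about.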

koneko wrote:

hmmm, do we have something like a NS_SPECIAL => 'noindex' too?