
Don't index pages generated by DoubleWiki
Closed, Resolved · Public

Description

The pages generated by DoubleWiki are indexed by search engines like Google, so there are duplicate entries for each page.

It may be good to add noindex,nofollow to the pages generated by DoubleWiki.


Version: unspecified
Severity: enhancement
URL: http://www.google.fr/search?q=%22La%20mort%20et%20le%20Malheureux%22%20site:wikisource.org

Details

Reference
bz31726

Event Timeline

bzimport raised the priority of this task to Needs Triage. Nov 21 2014, 11:49 PM
bzimport set Reference to bz31726.
  • Bug 41256 has been marked as a duplicate of this bug.

Created attachment 11515
google search

Attached:

google_search.png (539×1 px, 252 KB)

The robot policy set in the earlier commit is now overridden by the Article class, so the "noindex, nofollow" isn't output. I have made a new patch that fixes this issue: https://gerrit.wikimedia.org/r/#/c/38785/

Thanks for the report!
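
For context, here is a minimal sketch of how an extension can force such a robot policy, assuming MediaWiki's standard OutputPageBeforeHTML hook and OutputPage::setRobotPolicy(). This is an illustration only, not the actual patch from the Gerrit change above; the check on the "match" URL parameter reflects how DoubleWiki bilingual views are requested.

```php
<?php
// Hypothetical illustration, not the actual Gerrit patch: mark DoubleWiki
// page views as noindex,nofollow from an extension hook.
class DoubleWikiRobotPolicy {
	/**
	 * OutputPageBeforeHTML hook handler (a standard MediaWiki hook).
	 *
	 * @param OutputPage $out The page being rendered
	 * @param string &$text The HTML about to be added to the output
	 * @return bool True to continue hook processing
	 */
	public static function onOutputPageBeforeHTML( OutputPage $out, &$text ) {
		// DoubleWiki bilingual views are requested with a "match" URL
		// parameter (e.g. ?match=fr); only those duplicate copies should
		// be hidden from crawlers, not the normal article view.
		if ( $out->getRequest()->getVal( 'match' ) !== null ) {
			// Emits <meta name="robots" content="noindex,nofollow"/>.
			$out->setRobotPolicy( 'noindex,nofollow' );
		}
		return true;
	}
}
```

Note that, as the comment above describes, the Article class can apply its own robot policy during view rendering, so whether a hook like this takes effect depends on when it runs relative to that; the linked patch is what actually resolved the override.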

(In reply to comment #6)

> The robot policy set in the earlier commit is now overridden by the Article
> class, so the "noindex, nofollow" isn't output. I have made a new patch that
> fixes this issue: https://gerrit.wikimedia.org/r/#/c/38785/
>
> Thanks for the report!

Merged on the 21st.