
Add URL whitelist for CAPTCHA
Closed, Resolved · Public

Description

Author: dunc_harris

Description:
A CAPTCHA has to be solved by new or logged-out users when they add a new external link. This is to prevent spamming, OK great. So why do it for websites that are reliable?

Examples:

media websites, e.g.

  • bbc.co.uk
  • nytimes.com

government websites, e.g.

  • .gov
  • .gov.uk
  • .mod.uk

misc, e.g.

  • jstor.org

etc.?

Now, links to Wikimedia websites are already approved without the CAPTCHA. So presumably there is a list (somewhere) of approved websites, and this can be extended?

If so there needs to be a process whereby such sites are suggested, approved, and then added to the system.


Version: unspecified
Severity: enhancement

Details

Reference
bz11585

Event Timeline

bzimport raised the priority of this task to Low. · Nov 21 2014, 9:55 PM
bzimport set Reference to bz11585.
bzimport added a subscriber: Unknown Object (MLST).

axel9891 wrote:

I think the blacklist at http://meta.wikimedia.org/wiki/Spam_blacklist is (or should be) involved here. Blacklisted sites can never be entered anyway. Maybe a whitelist of good sites exists somewhere?

alexsm333 wrote:

It looks like this feature has been implemented since 2007, so this bug can be closed:

http://www.mediawiki.org/wiki/Extension:ConfirmEdit#URL_and_IP_whitelists
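
For anyone landing here later, a minimal sketch of what that configuration might look like in LocalSettings.php, assuming the $wgCaptchaWhitelist regex variable described on that page. The exact variable name, format, and loading mechanism may differ between MediaWiki versions, so check the extension documentation; the hosts below are just the examples suggested in this task's description.

```
<?php
// LocalSettings.php — a sketch, assuming the $wgCaptchaWhitelist regex
// variable documented for Extension:ConfirmEdit; verify the variable name
// and format against the docs for your MediaWiki version.

wfLoadExtension( 'ConfirmEdit' );

// One regex matched against each newly added external link; links that
// match are exempt from the CAPTCHA. The hosts listed here are the
// illustrative examples from this task, not a recommended whitelist.
$wgCaptchaWhitelist =
	'#^https?://([a-z0-9-]+\.)*' .
	'(bbc\.co\.uk|nytimes\.com|jstor\.org|[a-z0-9-]+\.(gov|gov\.uk|mod\.uk))' .
	'(/|$)#i';
```

Sites on the spam blacklist are rejected outright before the CAPTCHA check is reached, so the two lists do not conflict.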