Author: dunc_harris
Description:
A CAPTCHA has to be solved by new or logged-out users when they add a new external link. This is to prevent spamming, which is fine. But why require it for links to websites that are reliable?
Examples:
media websites, e.g.
- bbc.co.uk
- nytimes.com
government websites, e.g.
- .gov
- .gov.uk
- .mod.uk
misc, e.g.
- jstor.org
etc.
Now, links to Wikimedia websites are approved without the CAPTCHA. So presumably there is a list (somewhere) of approved websites, and this can be extended?
If so there needs to be a process whereby such sites are suggested, approved, and then added to the system.
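To illustrate the idea, here is a minimal sketch of what such an approved-site check could look like, assuming the list is a simple URL regex matched before the CAPTCHA is triggered. The domain list and function name are hypothetical examples, not the actual MediaWiki implementation or configuration.

```python
import re

# Hypothetical list of trusted sites; the real list would live in site
# configuration and be maintained through the suggestion/approval process
# proposed above. Domains here are illustrative examples only.
TRUSTED_LINK_PATTERN = re.compile(
    r"^https?://([a-z0-9-]+\.)*(bbc\.co\.uk|nytimes\.com|jstor\.org)(/|$)"
    r"|^https?://([a-z0-9-]+\.)*[a-z0-9-]+\.(gov|gov\.uk|mod\.uk)(/|$)",
    re.IGNORECASE,
)

def link_skips_captcha(url: str) -> bool:
    """Return True if the external link matches the approved-site list."""
    return TRUSTED_LINK_PATTERN.match(url) is not None
```

Anchoring the pattern and requiring `/` or end-of-string after the domain matters: it stops look-alike hosts such as `bbc.co.uk.evil.example` from slipping past the check.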
Version: unspecified
Severity: enhancement