
allow per-page exceptions to spam blacklist
Status: Open · Priority: Low · Public · Feature Request

Description

I think the spam blacklist should allow exceptions for specific pages.
For example, if www.samplelink.com is blacklisted, it should still be usable on the [[Samplelink]] article.
I find this especially useful for shock sites like goatse.cx, which end up being notable enough to have an article, but there might be other uses I haven't thought of.

I know there is a site-wide whitelist, but maybe this list could be adjusted to allow specifying only a given page (or pages) where the link is acceptable.
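To make the request concrete, here is a minimal sketch of the per-page exception logic being proposed. This is purely illustrative Python, not the real SpamBlacklist extension (which is written in PHP); the mapping, patterns, and page titles are all hypothetical.

```python
import re

# Hypothetical config: each blacklisted URL pattern may list the
# article titles on which the link is nevertheless allowed.
PER_PAGE_EXCEPTIONS = {
    r"\bsamplelink\.com\b": {"Samplelink"},  # allowed only on [[Samplelink]]
    r"\bgoatse\.cx\b": {"Goatse.cx"},        # allowed only on its own article
}

def link_blocked(url: str, page_title: str) -> bool:
    """Return True if `url` matches a blacklisted pattern and
    `page_title` is not listed as an exception for that pattern."""
    for pattern, allowed_pages in PER_PAGE_EXCEPTIONS.items():
        if re.search(pattern, url):
            return page_title not in allowed_pages
    return False  # URL is not blacklisted at all
```

Under this scheme the same link is rejected on an arbitrary page but accepted on the designated article, which is exactly the per-page behaviour the report asks for.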


Version: unspecified
Severity: enhancement

Details

Reference
bz12963

Related Objects

This task is connected to more than 200 other tasks. Only direct parents and subtasks are shown here.

Event Timeline

bzimport raised the priority of this task to Low. (Nov 21 2014, 10:02 PM)
bzimport added a project: SpamBlacklist.
bzimport set Reference to bz12963.
bzimport added a subscriber: Unknown Object (MLST).

As a side note, the behavior would be similar to [[MediaWiki:Bad image list]] (that is, such a mechanism is already implemented elsewhere).
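For reference, entries in MediaWiki:Bad image list already use a per-page exception syntax roughly along these lines (the file and page names below are made up; the "except on" wording is only conventional, since the parser looks at the links on the line, not the surrounding text):

```
* [[:File:Shock image.jpg]] except on [[Goatse.cx]], [[Shock site]]
```

Each line names one restricted image, and any further page links on the same line whitelist that image on those specific pages — the same shape of mechanism this task requests for blacklisted URLs.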

mike.lifeguard+bugs wrote:

(In reply to comment #0)

I know there is a site-wide whitelist, but maybe this list could be adjusted to
allow specifying only a given page (or pages) where the link is acceptable.

Is there a reason the pre-existing whitelist mechanism is insufficient for this purpose? I really don't think this is a problem & I'd recommend WONTFIX.

The site-wide whitelist allows the listed URLs on all pages of the site, afaik; it is meant to override the global blacklist (that is, it is a site-wide list of exceptions for URLs blacklisted on the meta site, rather than single-page exceptions for the site-wide blacklist). Now, I might be wrong, but I'm pretty sure I asked some people about this before I submitted this bug. If the whitelist does indeed allow specifying only one or a few pages where a URL is allowed (like in [[MediaWiki:Bad image list]]), then this should at best be closed as WORKSFORME. Otherwise, I think it is a worthy enhancement.

mike.lifeguard+bugs wrote:

(In reply to comment #3)

The site-wide whitelist allows the listed URLs on all pages of the site, afaik; it is meant to override the global blacklist (that is, it is a site-wide list of exceptions for URLs blacklisted on the meta site, rather than single-page exceptions for the site-wide blacklist).

No, it is /of course/ used for exceptions to blacklisting on the local blacklist! This is often done for domains which have been spammed, but the main page (domain.com/index.html or something) is needed as a link on the article about the site/organization/whatever.
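Concretely, both the blacklist and the whitelist hold one regular-expression fragment per line, matched against URLs being added; a whitelist line narrower than the corresponding blacklist line carves out the exception described above (the domain name here is illustrative, and the `#` comments are just annotations):

```
# In MediaWiki:Spam-blacklist — blocks the whole domain
\bspammed-domain\.com\b

# In MediaWiki:Spam-whitelist — re-allows just the front page
\bspammed-domain\.com/index\.html\b
```

Note that this exception still applies on every page of the wiki, which is the limitation the reporter is pointing at.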

Now, I might be wrong, but I'm pretty sure I asked some people about this before I submitted this bug. If the whitelist does indeed allow specifying only one or a few pages where a URL is allowed (like in [[MediaWiki:Bad image list]]), then this should at best be closed as WORKSFORME. Otherwise, I think it is a worthy enhancement.

It doesn't allow that currently, but I just don't think it's useful. I can't think of a case where it's needed.

(In reply to comment #4)
No, it is /of course/ used for exceptions to blacklisting on the local
blacklist! This is often done for domains which have been spammed, but the main
page (domain.com/index.html or something) is needed as a link on the article
about the site/organization/whatever.

Oh! then I guess I didn't investigate the issue well enough back when I submitted this :)

It doesn't allow that currently, but I just don't think it's useful. I can't
think of a case where it's needed.

Well, it would be useful for shock sites, as I noted in my first comment. And possibly there could be other uses (e.g. sites we could link to from their own article, but which it would generally be a bad idea to link to from other pages).

mike.lifeguard+bugs wrote:

(In reply to comment #5)

Well, it would be useful for shock sites as I noted in my first comment. And
possibly there could be other uses (e.g. sites we could link to from their
article, but which generally would be a bad idea to link to from other pages)

Yep, these use cases are already covered adequately by the whitelist, I think.

(In reply to comment #6)

these use cases are already covered adequately by the whitelist, I think.

Even when the spam is done using the main page of the domain? I'm asking this especially because shock sites (such as goatse.cx) usually only have the main page (I might again be wrong, as I'm not an expert in the subject :)).

mike.lifeguard+bugs wrote:

(In reply to comment #7)

(In reply to comment #6)

these use cases are already covered adequately by the whitelist, I think.

Even when the spam is done using the main page of the domain? I'm asking this
especially because shock sites (such as goatse.cx) usually only have the main
page (I might again be wrong, as I'm not an expert in the subject :)).

Right, but you can whitelist the domain temporarily, add the link to the article, and then remove it from the whitelist. Until bug 16325 is fixed. After that... dunno. In that case maybe. Then the issue becomes whether you want to actually link to goatse.cx on [[goatse.cx]]... I'm not sure that we do (and in fact we don't do that). I really really don't see the point in implementing this, but I'm done commenting on it. Again, I'd suggest that this be marked WONTFIX.

Ok, I don't think it's critical to have those links, I just thought that not having them could be considered some sort of censorship. If no one else finds this relevant, then let's have this WONTFIXed... at least it's one less bug for the devs to care about :)

This might be sensible, but isn't necessarily high-priority.

Aklapper changed the subtype of this task from "Task" to "Feature Request". (Feb 4 2022, 11:01 AM)
Aklapper removed a subscriber: wikibugs-l-list.