
Blacklisted links should mean the page can't be saved
Open, Low, Public

Description

Author: mike.lifeguard+bugs

Description:
Basically, this is a request to undo r34769 (T3505: Spam blacklist, if match, check previous version and accept if URL present there).
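
For context, the r34769 behaviour being undone works roughly like this: a link matching the blacklist is still accepted if the previous revision of the page already contained it. A minimal Python sketch (illustrative names, not the extension's actual PHP code); the request here is that any match should block the save unconditionally:

    import re

    def check_save(blacklist_patterns, old_text, new_text):
        # Sketch of the r34769 rule: a blacklisted link only blocks the
        # save if it was NOT already present in the previous revision.
        blocked = []
        for pattern in blacklist_patterns:
            for match in re.finditer(pattern, new_text):
                url = match.group(0)
                if url not in old_text:  # newly added occurrence -> reject
                    blocked.append(url)
        return blocked  # empty list means the edit is accepted

    # The pre-existing bad link survives (e.g. during a revert); only the
    # newly added one blocks the save:
    patterns = [r"https?://\S*badsite\.example\S*"]
    old = "text with http://badsite.example/a"
    new = "text with http://badsite.example/a and http://badsite.example/b"
    print(check_save(patterns, old, new))  # ['http://badsite.example/b']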

Allowing blacklisted links to remain in the page is bad for a few reasons:

Concerns about reverting vandalism (T3505#56358) are valid, and should be addressed; see T17450: Rollback should be unaffected by the spam blacklist to make *rollback* (only) exempt.
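
The rollback carve-out proposed in T17450 amounts to gating the blacklist check on the kind of edit being made. A hedged sketch, assuming a hypothetical save hook and action names (the extension's real hooks differ):

    import re

    def find_blacklisted(patterns, text):
        # Every link in `text` matching any blacklist pattern.
        return [m.group(0) for p in patterns for m in re.finditer(p, text)]

    def on_save(action, patterns, new_text):
        # Hypothetical hook: only rollback skips the check; ordinary
        # edits are always filtered, which is what this task requests.
        if action != "rollback":
            hits = find_blacklisted(patterns, new_text)
            if hits:
                raise ValueError(f"save rejected, blacklisted link(s): {hits}")
        # ...otherwise proceed with saving the revision...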

Details

Reference
bz16325

Event Timeline

bzimport raised the priority of this task to Low. Nov 21 2014, 10:26 PM
bzimport added a project: SpamBlacklist.
bzimport set Reference to bz16325.
bzimport added a subscriber: Unknown Object (MLST).

cbm.wikipedia wrote:

One issue with this is that if page A transcludes page B, and B trips the spam filter, then saving A will fail. To make this less bad, the spam filter should be put deeper into the parser code, so that as soon as B is parsed the spam filter is checked against it. That would allow a more detailed error message - that page A cannot be saved because page B trips the spam filter with a certain link.
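
In other words, the check would run once per expanded page rather than once over the final wikitext, so a failure can be attributed to the page that actually carries the link. A rough Python sketch, assuming hypothetical source_of/transclusions_of lookups rather than MediaWiki's real parser structure:

    import re

    def check_page_tree(page, patterns, source_of, transclusions_of):
        # Depth-first: check everything `page` transcludes before the
        # page itself, so the innermost offender is named first.
        for sub in transclusions_of(page):
            check_page_tree(sub, patterns, source_of, transclusions_of)
        for pat in patterns:
            m = re.search(pat, source_of(page))
            if m:
                raise ValueError(
                    f"page {page!r} trips the spam filter "
                    f"with link {m.group(0)!r}")

    # Saving A would then report B as the culprit:
    pages = {"A": "{{B}} plus fine text", "B": "see http://badsite.example/x"}
    subs = {"A": ["B"], "B": []}
    try:
        check_page_tree("A", [r"https?://\S*badsite\.example\S*"],
                        pages.__getitem__, subs.__getitem__)
    except ValueError as e:
        print(e)  # page 'B' trips the spam filter with link 'http://badsite.example/x'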

mike.lifeguard+bugs wrote:

(In reply to comment #1)

One issue with this is that if page A transcludes page B, and B trips the spam filter, then saving A will fail. To make this less bad, the spam filter should be put deeper into the parser code, so that as soon as B is parsed the spam filter is checked against it. That would allow a more detailed error message - that page A cannot be saved because page B trips the spam filter with a certain link.

That would require a more thorough rewrite, I think, and should therefore be requested separately. This might conceivably be done at the same time as bug 4459.

Isn't this how it works now? I'm sure I've seen issues with bots on en.wiki failing because there are blacklisted links on the page.

He7d3r set Security to None.