
Log full URLs that hit the blacklist as well on Special:Log/spamblacklist
Closed, Resolved · Public

Description

In addition to just the part that matches the regex.

Details

Reference
bz55356

Event Timeline

bzimport raised the priority of this task from to High.Nov 22 2014, 2:36 AM
bzimport added a project: SpamBlacklist.
bzimport set Reference to bz55356.
bzimport added a subscriber: Unknown Object (MLST).
liangent created this task.Oct 5 2013, 8:57 PM

I'm not sure if there will be a simple way to do this without breaking b/c...

I'm still not completely comfortable with the idea of logging full (blacklisted) URLs like this. I think this may have been an intentional design decision?

(In reply to comment #2)

I think this may have been an intentional design decision?

No, it was an oversight on my part.

(In reply to comment #2)

I'm still not completely comfortable with the idea of logging full (blacklisted) URLs like this. I think this may have been an intentional design decision?

Anyway in contrast, abuse log (Extension:AbuseFilter) contains every detail of an editing action, even when it's rejected.

Beetstra.wiki wrote:

I think this REALLY should be added ASAP - spammers use redirects to spam their sites anyway, and hits like 'goo.gl', 'ow.ly', and 'tinyurl.com' do not help at all. Having the full link enables us to find what is being linked to, and whether or not the spam problem still exists (please, do 'disable' the links by removing the 'http://'-part, no need to accidentally click a bad link). Thanks! --~~~~
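The "disable the links" suggestion above amounts to defanging logged URLs so they cannot be accidentally clicked. A minimal sketch of that idea in Python (a hypothetical helper for illustration, not code from the actual extension):

```python
import re

def defang_url(url: str) -> str:
    """Strip the scheme (e.g. 'http://') so a logged URL is
    displayed as plain text rather than a clickable link."""
    return re.sub(r'^[a-z][a-z0-9+.\-]*://', '', url, flags=re.IGNORECASE)

print(defang_url('http://tinyurl.com/abc123'))  # tinyurl.com/abc123
```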

Beetstra.wiki wrote:

Can we please have a fix for this?

In addition, a way to find links that someone attempted to spam would be welcome. It is currently nigh impossible to find who tried to add http://www.xxx.com.

Un-cookie licking.

Change 169314 had a related patch set uploaded by Ejegg:
Log full URLs on spam blacklist hit

https://gerrit.wikimedia.org/r/169314

Ejegg added a comment.Oct 28 2014, 8:24 PM

(In reply to Kunal Mehta (Legoktm) from comment #1)

I'm not sure if there will be a simple way to do this without breaking b/c...

What's your concern re: breakage? Is the content of Special:Log being parsed by things that expect just the matching domain? Or is the worry that regexes shared between wikis running different versions of this extension would be inconsistent? The patch I submitted shouldn't have the latter problem, as it constructs a full-line-matching regex for logs only when the initial regex detects a match.
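The two-pass approach described here can be sketched as follows: a blacklist regex first detects a match, and only then is a broader full-URL-matching regex built to extract the complete URL for the log entry. This Python sketch is illustrative only (the blacklist entry and function names are hypothetical; the real patch is PHP in the SpamBlacklist extension):

```python
import re

# Hypothetical blacklist: patterns that match only a URL fragment,
# such as a shortener domain.
BLACKLIST = [re.compile(r'tinyurl\.com', re.IGNORECASE)]

def find_full_urls(text: str) -> list[str]:
    """Return the full URLs around any blacklisted fragment in text.

    The full-line-matching regex is constructed lazily, only after
    the cheap fragment regex has already found a hit.
    """
    hits = []
    for pattern in BLACKLIST:
        if pattern.search(text):
            # Expand the fragment pattern to capture the whole URL
            # surrounding it, not just the matching domain.
            full = re.compile(r'https?://\S*' + pattern.pattern + r'\S*',
                              re.IGNORECASE)
            hits.extend(full.findall(text))
    return hits

print(find_full_urls('see http://tinyurl.com/abc123 for details'))
# ['http://tinyurl.com/abc123']
```

Because the expanded regex is built only on a confirmed hit, the normal (no-match) code path is unchanged, which is why the patch should not affect wikis sharing blacklist regexes across extension versions.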

(In reply to comment #2)

I'm still not completely comfortable with the idea of logging full (blacklisted) URLs like this. I think this may have been an intentional design decision?

Anyway in contrast, abuse log (Extension:AbuseFilter) contains every detail of an editing action, even when it's rejected.

Imo, this would be the best approach: a link to a page with details on the attempted edit, similar to the abuse filter one, i.e. with the diff, new wikitext, old wikitext, etc. (example). There is no risk of clicking the link that way, and access can be restricted. Having the link(s) in context would provide much more information about the user's intention. I don't know about feasibility, or whether the AbuseFilter code can be adapted, though.

Glaisher closed this task as Resolved.Apr 7 2015, 5:32 PM
Glaisher assigned this task to Ejegg.
Glaisher updated the task description. (Show Details)
Glaisher removed a project: Patch-For-Review.
Glaisher set Security to None.
Glaisher removed a subscriber: Unknown Object (MLST).