Come up with an NSFW filter for images in NewcomersTasks
Open, Needs Triage, Public

Description

It seems to me that newcomers, sitting at work, will be very surprised to see such content on their page. I know about the disclaimer principle, but in our case it can be frustrating.
F34451984 (NSFW)

Event Timeline

Restricted Application added a subscriber: Aklapper.

Thanks, @Iniquity. We've talked about this a little bit, but haven't done anything to filter out articles that might not be best as the first articles for newcomers to see. If there were a template or category that was on such pages, it would be easy to filter them out, but I don't know that such a uniform template/category exists. I understand the idea of NSFW filters on images, but in this case, I think what we would be trying to filter out are articles that may contain such images (or even surprising content that is not images). Maybe we can think about this more in the future.
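
For illustration only: if such a uniform marker did exist, the check would be a single API lookup. Below is a minimal TypeScript sketch under that assumption; the category name Category:NSFW, the apiBase parameter, and the function name are all hypothetical placeholders, since no such category is known to exist across wikis.

```typescript
// Minimal sketch: skip an article if it carries a hypothetical marker category.
// "Category:NSFW" is an assumed placeholder; as noted above, no uniform
// template/category like this is known to exist.
async function hasMarkerCategory(apiBase: string, title: string): Promise<boolean> {
  const url =
    `${apiBase}?action=query&prop=categories&format=json&origin=*` +
    `&clcategories=${encodeURIComponent('Category:NSFW')}` +
    `&titles=${encodeURIComponent(title)}`;
  const data = await (await fetch(url)).json();
  const pages = Object.values(data?.query?.pages ?? {}) as any[];
  // `categories` is only present when the page is in one of the listed categories.
  return (pages[0]?.categories ?? []).length > 0;
}
```

A task-pool builder could then drop any candidate article for which this resolves to true.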

Yes, I agree that there may be inappropriate content other than images. For example, we have this template (https://ru.wikipedia.org/wiki/Шаблон:Ненормативная_лексика, "Profanity"), which is placed on articles containing profanity and obscene vocabulary. I might start a discussion about a similar template for articles with nude images.

Relying on local templates would be too difficult to deal with: it is already very hard to find which templates we have to use for newcomer tasks on most wikis! The template given as an example for Russian Wikipedia has only 6 interwiki links across all Wikipedias. If we decide to remove what is considered NSFW (which can vary from one person to another and lead to potential censorship), we need to find a centralized option.

Filtering out these images should be done from Commons, maybe using https://commons.wikimedia.org/wiki/MediaWiki:Gadget-NSFW.js as already suggested. If any NSFW image is in the article, then we could exclude the article.
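
As a minimal sketch of that exclusion rule: list the article's files, then exclude the article if any file is flagged. The per-file predicate isNsfwFile is a hypothetical parameter here; where its data would come from (the gadget's data, or the classifier from T264045) is exactly the open question, not this loop.

```typescript
// Minimal sketch: exclude an article from the newcomer task pool if any of
// its files is flagged NSFW. `isNsfwFile` is a hypothetical predicate whose
// data source (gadget list, classifier) is the unsolved part of this task.
async function articleFiles(apiBase: string, title: string): Promise<string[]> {
  const url =
    `${apiBase}?action=query&prop=images&imlimit=max&format=json&origin=*` +
    `&titles=${encodeURIComponent(title)}`;
  const data = await (await fetch(url)).json();
  const pages = Object.values(data?.query?.pages ?? {}) as any[];
  return (pages[0]?.images ?? []).map((img: any) => img.title as string);
}

async function isSafeForNewcomers(
  apiBase: string,
  title: string,
  isNsfwFile: (fileTitle: string) => Promise<boolean>,
): Promise<boolean> {
  for (const file of await articleFiles(apiBase, title)) {
    if (await isNsfwFile(file)) return false; // one flagged file excludes the article
  }
  return true;
}
```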

Related: T264045: Not Safe for Work (NSFW) media Classifier for Wikimedia Commons.

Ideally, NSFW articles would be flagged in the search index, but that seems way beyond what's plausible for us to do.

Another, less ideal but more realistic option would be to store the image's NSFW state somewhere (MCR? page_props? T282585: TDMP DR: Provide for asynchronously-available MediaWiki parser content fragments / components might be relevant, although that is about properties derived from page content, and files are handled very differently), expose it in imageinfo API requests, and then we could filter on the client side like we do for protected articles.
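
Sketching the client-side half under that assumption: suppose the flag were surfaced as a hypothetical NSFW key in imageinfo's extmetadata (no such key exists today; it stands in for whatever property T279416 / T282585 would eventually expose). The per-file check could then look like:

```typescript
// Minimal sketch: read a hypothetical per-file NSFW flag via the imageinfo API.
// The `NSFW` extmetadata key is an assumption, not a real key; it stands in
// for whatever property the classifier work would eventually expose.
async function fileIsNsfw(apiBase: string, fileTitle: string): Promise<boolean> {
  const url =
    `${apiBase}?action=query&prop=imageinfo&iiprop=extmetadata&format=json&origin=*` +
    `&titles=${encodeURIComponent(fileTitle)}`;
  const data = await (await fetch(url)).json();
  const pages = Object.values(data?.query?.pages ?? {}) as any[];
  const meta = pages[0]?.imageinfo?.[0]?.extmetadata;
  return meta?.NSFW?.value === 'true'; // hypothetical flag
}
```

This is the shape of predicate the earlier isSafeForNewcomers sketch expects as its isNsfwFile parameter.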

All this is blocked on T279416: Deploy Image content filtration model for Wikimedia Commons, though (the NSFW gadget uses WDQS, I doubt we want to do that in production frontend code).

MMiller_WMF added a subscriber: Miriam.

@Miriam -- FYI this is another use case for NSFW work on images from Commons. Beyond just screening out suggested images for the "add an image" task, such a filter could be used to screen out any articles from any task types (e.g. "add a link") that contain NSFW images.

We have a case at ar.wp where the tool suggested adding an adult movie to an article about sexuality.

image.png (913×540 px, 127 KB)

Is the appearance of such content taken into account in the Image-Suggestions? :)

The community is not likely to accept what is proposed here.

The community is not likely to accept any task of deciding which images should be included in or excluded from filtering, nor is the community likely to accept any mechanism which default-blocks content anywhere on the platform.

You should note that "NSFW" is only the second most requested kind of filtering. The number one filtering demand is religious - and particularly images of Muhammad.

The community has a particular dedication to our educational mission, a particular dedication to neutrality, and by extension a particular dedication to our NOTCENSORED policy. We do not accept demands to remove, filter, or hide religious images based on the culturally subjective demands of some people to restrict other people's access to religious imagery. And similarly, we do not accept demands to remove, filter, or hide images based on the culturally subjective demands of some people to restrict other people's access to allegedly "NSFW" images. Note that when such a filter was advocated by the WMF, there were demands that any images of women be filtered. The Burka standard.

Anyone trying to implement filtering on "newcomer tasks" or anywhere else is sure to get verbally beaten down by the Muhammad, Burka, and other arguments. Such proposals always fail because they are based on an inherently subjective cultural opinion about which images should or should not be filtered.

It is not acceptable for anyone here to attempt to ban "newcomers" from working on religious articles - including the Muhammad article with images of Muhammad. And if some "newcomer" is adding images to cancer articles, then yes, they absolutely should be getting "NSFW" images of naked breasts for the Breast cancer article.

The main flaw in your reasoning is that you assume users themselves choose the content they will view. This is not the case with NewcomersTasks. The second point is that you incorrectly define the word "censorship"; it has nothing to do with this idea.

@Iniquity I don't know how much experience you have with the history of community discussions in this topic area, but I have quite a bit. I wasn't arguing that we shouldn't filter content; I was summarizing the consistent historical outcome of international and major-wiki discussions in this topic area. You appear to have paid little attention to what I wrote; you didn't address or even dispute any of it. The history clearly indicates that any work on this Phab task would be a complete waste of time; there is no chance the community would accept it. The community is likely to get rather hostile if they discover the Foundation attempted to BAN some contributors from improving certain articles, just because someone had a negative opinion regarding some of our article content and slapped it on a censorship list.

Regarding your comment that your proposal is not "censorship", that's a claim I've commonly seen in past community discussions about filtering. That claim has been consistently rejected by the community. It does not matter that you are not proposing complete and total censorship, and it does not matter that you are only proposing to filter content in some locations for some users. The consistent community result is that this kind of filtering is "censorship", and that incomplete or partial censorship is still "censorship".

There is no possibility the community will allow some outside entity to impose their opinions on which content should or should not be filtered. And there is no possibility that the community is going to accept the task of endlessly warring over which opinion to impose for each piece of content. It would be a massive hellhole of inflammatory disruption and distraction from work on the encyclopedia.

Such debates almost invariably begin with filtering advocates imagining it's simple and obvious what should be filtered. Such debates typically end with filter advocates realizing their position is hopeless because potential allies want to filter the "wrong" things. Are we going to filter images of Muhammad? Depictions of other gods and prophets? Breasts? Bibles? Bikinis? Holy land that an indigenous tribe says can't be photographed? Holy land that some cult or random yahoo claims can't be photographed? All women not in full-body Burka cover? Dead bodies in photos of war crimes? The 9/11 Twin Towers with people dying inside? Mormon body-suit underwear? Medical images of open-body "gore"? Dildos? Bondage gear? A woman in bondage? A man in bondage? In handcuffs? Air crashes? Miniskirts? Gun violence? A boxer with a bloody, grossly bruised face? A woman crime victim with a bloody face and black eye? A male body builder in just a tiny bikini bottom? Heck, some religious beliefs object to any imagery of humans. How about if flat earthers find any globe image offensive? The Hiroshima and Nagasaki atomic blasts? A fashion advertisement of an underage model in an arguably "sexy" outfit and an arguably suggestive pose? Little beauty-pageant girls glamoured up like prostitutes? Slaves with horrific whipping scars? An actual prostitute in prostitute-wear? An actress at the Academy Awards wearing the exact same outfit? A race horse that is VERY visibly male? Wild animals mating?

The list is endless, and it doesn't matter how you answer those cases. Any potential "allies" in that debate will almost invariably sabotage any possibility of you winning the filter-debate because they know your answers are wrong.

Alsee, you can make your point without using sexual or violent references in your message.

Please re-read the Code of conduct, where gratuitous or off-topic use of sexual language or imagery is highlighted as unacceptable.