
Allow hiding certain (NSFW etc) images by default and letting users explicitly expand them
Open, Needs Triage, Public


For educational purposes, Wikipedia contains images of genitalia and illustrations of sexual activities. However, not all readers are comfortable encountering such content. I therefore propose an approach that lets such content remain on Wikipedia while sparing readers who do not want to see it: blur the image and overlay a short warning with a display button, as Reddit does.

Reddit implementation (source: Google):

Proposed WP implementation (clicking anywhere on the image displays it):

How would images be marked?
Images could be marked automatically via specific categories applied to their file pages. In addition, individual NSFW images could be marked on the page itself (e.g. with a template). There is also MediaWiki:Bad image list.

Opt out of the warning via Preferences (for logged-in users); perhaps a checkbox for logged-out users?
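The marking rules and the opt-out preference described above could be sketched as a single predicate. This is a minimal illustration only: the category names, the bad-image-list contents, and the `userOptOut` flag are invented for the sketch, not an existing MediaWiki API.

```typescript
// Sketch: decide whether an image should be blurred by default.
// Category names and the bad-image list below are hypothetical examples;
// a real wiki would configure its own.
const NSFW_CATEGORIES = new Set([
  "Category:Nudity",          // assumed example category
  "Category:Sexual activity", // assumed example category
]);

// Stand-in for MediaWiki:Bad image list entries (file page titles).
const badImageList = new Set(["File:Example-explicit.jpg"]);

interface ImageInfo {
  title: string;              // e.g. "File:Example.jpg"
  categories: string[];       // categories on the file page
  markedByTemplate?: boolean; // per-image marking via a template
}

function shouldBlurByDefault(img: ImageInfo, userOptOut: boolean): boolean {
  if (userOptOut) return false; // preference: never blur for this user
  if (img.markedByTemplate) return true;
  if (badImageList.has(img.title)) return true;
  return img.categories.some((c) => NSFW_CATEGORIES.has(c));
}
```

Any one of the three signals (category, template, bad image list) is enough to blur; the user preference overrides them all.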

Event Timeline

Batreeq created this task. · Jul 1 2018, 2:40 AM
Restricted Application added a subscriber: Aklapper. · Jul 1 2018, 2:40 AM
Batreeq triaged this task as High priority. · Jul 1 2018, 2:53 AM
Batreeq updated the task description.
Batreeq awarded a token.
Batreeq renamed this task from NSFW Images to NSFW Images on WP. · Jul 1 2018, 2:55 AM
Batreeq updated the task description.
Batreeq rescinded a token.
ToBeFree added a subscriber: ToBeFree. (Edited) · Jul 1 2018, 3:54 AM

Comment by a non-developer: for logged-in users, permanently disabling this feature should be as easy as clicking "show NSFW image" and then "do not censor images for me again".

Note that on the English Wikipedia this idea is neither new nor, so far, welcomed as a default: wikipedia:en:Help:Options_to_hide_an_image#Hide_all_images_until_click_to_view

The following script is licensed Creative Commons: Attribution-ShareAlike, just like all these English Wikipedia pages. You may want to copy it and use it for your own purpose:

Edit: You also need

ToBeFree raised the priority of this task from High to Needs Triage. · Jul 1 2018, 3:54 AM
stjn added a subscriber: stjn. · Jul 2 2018, 4:38 AM

Such a change, if it is to be global, must be discussed; see the links in:

This was discussed back in 2011 and was not developed further, so any renewed development would require a newer poll (seven years is already quite a long time ago).

Note that a similar idea was discussed in 2011; see e.g. here and here.

This is not an existing Beta Feature, hence removing the tag. It sounds like something someone could write an extension for if they want such functionality on their wiki.

Aklapper renamed this task from NSFW Images on WP to Allow hiding certain (NSFW etc) images by default and letting users explicitly expand them. · Jul 2 2018, 2:07 PM
Batreeq added a comment. (Edited) · Jul 5 2018, 6:00 AM

I reviewed the pages; however, they were long and somewhat confusing. Can someone explain in a few sentences why this idea was discarded? Thank you!

Hi :) I will try. This is my opinion. This is not a neutral summary.

What is "controversial"? What is "bad"? Who decides what needs to be removed?

An image of the flying spaghetti monster is insulting to my religion! I want all images of flying spaghetti monsters removed from Wikipedia.
Images of nude people in articles about "Man" and "Woman" offend me! I do not want to see a man when opening the Man article!
I am a vegetarian. I do not want to see images of meat and ham. I do not want to see animal pictures if the animals are not happy! I do not want to see factory farming!
I hate vegetarians. I hate vegetables. When I see images of apples and bananas, I need to vomit. I do not want to see images of bananas and apples.
I am a 5 year old child. I do not want to see images about war. War traumatizes me. I do not want to see images/screenshots of games that are unsuitable for my age.

Who decides what gets censored?
Who decides what is unsuitable for work?
Why don't we delete these images instead of hiding them?
Wikipedia is not censored.
To change the behavior of Wikipedia, please discuss on Wikipedia. You have requested a controversial change that affects all users. If you want to see it implemented, please discuss with these users. Phabricator is not the place to do so. Phabricator should be used when the discussion is over.
If this is about the English Wikipedia, please discuss on the English Wikipedia.
Please note that the topic has already been discussed. You are not the first person who has requested this:

Yes, this is point 2 of a huge list, and it appears at the top; that may be for a reason. Please also have a look at the first point of the list.

This does not mean that this feature request is completely invalid. It only means that you will not be able to evade/override discussion on the English Wikipedia by opening a Phabricator ticket. :)

There is now an NSFW image classifier running on Cloud Services, and from my observations it has proven to be a very effective algorithm. It would need to be in production for us to use it in an extension; that is tracked at T214201: Implement NSFW image classifier using Open NSFW, but unfortunately it has lost traction and is no longer on the roadmap. Perhaps the extension itself could include the classifier. The idea is to pre-store the scores; then, I suppose, there is a hook we could tie into to hide the relevant imagery on page load, as opposed to hiding it retroactively, which would be much too slow. Note that the extension would also need to live on Commons, since that is where most images live, and there could be a configuration variable to disable the auto-hide feature itself, since Commons specifically would not want it.
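The "pre-store the scores, then hide on page load" idea could look roughly like this. This is a sketch under the assumption that per-file scores already exist in some store; the score map stands in for a database table, and the 0.8 threshold and function names are made up for illustration.

```typescript
// Sketch: given pre-computed NSFW scores (0..1) keyed by file title,
// decide at page-render time which images to auto-hide.
// NSFW_THRESHOLD is an arbitrary example value, not a production setting.
const NSFW_THRESHOLD = 0.8;

function imagesToHide(
  pageImages: string[],            // file titles used on the page
  scores: Map<string, number>,     // stand-in for pre-stored DB scores
  autoHideEnabled: boolean,        // per-wiki config, e.g. off on Commons
): string[] {
  if (!autoHideEnabled) return []; // wiki opted out of auto-hiding
  // Unscored images default to 0 and are never hidden.
  return pageImages.filter((t) => (scores.get(t) ?? 0) >= NSFW_THRESHOLD);
}
```

Because the scores are looked up rather than computed per request, the decision can run during page rendering instead of retroactively.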

Note also there has been much debate on the name "NSFW". Some cultures do not consider the same images to be NSFW as Westerners do, for instance. Of course, each community will need to opt-in to the auto-hiding feature, and even then it would probably need to be a preference. As mentioned above, English Wikipedia in particular has continually rejected this idea.

I personally have no interest in the auto-hiding effort, specifically, since I know many/most communities won't accept it (understandably). I would however love to have the scores stored in a database so that AbuseFilter could make use of them, and along with various other heuristics we can put a stop to image vandalism. So if anyone wants to work on this proposed extension, let me know :)

I cordially support the project.

I also prefer a checkbox for logged-out users.

Imtiaz added a subscriber: Imtiaz. · Apr 30 2020, 11:43 AM
SHEKH added a subscriber: SHEKH. · May 3 2020, 8:17 AM

One more NSFW filter: I generally like the idea of this filter, if "NSFW" is to be read as "Not Safe For Work".

Added several possibly relevant projects; remove any that do not apply :)

Restricted Application added a project: Commons. · Sep 7 2020, 6:38 AM
Iniquity added a subscriber: putnik. · Sep 7 2020, 6:40 AM