It seems to me that newcomers will be very surprised, sitting at work, to find such content on their page. I know about the content disclaimer principle, but in our case it can be frustrating.
F34451984 (NSFW)
Description
Status | Subtype | Assigned | Task
---|---|---|---
Open | None | | T293711 [EPIC] "Add an image" Iteration 2
Open | None | | T282712 Come up with a NSFW filter for images in NewcomersTasks
Event Timeline
There is a script by @putnik that can be used as a starting point :)
https://commons.wikimedia.org/wiki/MediaWiki:Gadget-NSFW.js
Thanks, @Iniquity. We've talked about this a little bit, but haven't done anything to filter out articles that might not be best as the first articles for newcomers to see. If there were a template or category that was on such pages, it would be easy to filter them out, but I don't know that such a uniform template/category exists. I understand the idea of NSFW filters on images, but in this case, I think what we would be trying to filter out are articles that may contain such images (or even surprising content that is not images). Maybe we can think about this more in the future.
Yes, I agree that there may be inappropriate content other than images. For example, we have this template (https://ru.wikipedia.org/wiki/Шаблон:Ненормативная_лексика, "Profanity"), which is placed on articles containing obscene vocabulary. I might start a discussion about a similar template for articles with nude images.
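For illustration, a minimal sketch of how articles transcluding such a template could be collected and excluded, using the standard `list=embeddedin` API; hooking this into the newcomer tasks suggestion pipeline is only an assumption on my part:

```javascript
// Sketch only: list the article-namespace pages that transclude the ru.wiki
// profanity template, so they could be dropped from newcomer task suggestions.
// The API call is standard; the integration with the task pipeline is assumed.
new mw.ForeignApi( 'https://ru.wikipedia.org/w/api.php' ).get( {
	action: 'query',
	list: 'embeddedin',
	eititle: 'Шаблон:Ненормативная_лексика',
	einamespace: 0, // main (article) namespace only
	eilimit: 'max',
	formatversion: 2
} ).then( function ( response ) {
	var excluded = response.query.embeddedin.map( function ( page ) {
		return page.title;
	} );
	// "excluded" now holds article titles that should not be suggested to newcomers.
} );
```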
Handling local templates would be too difficult. It is already very hard to find which templates we have to use for newcomer tasks on most wikis! The template given as an example for Russian Wikipedia has only 6 interwiki links to other Wikipedias. If we decide to exclude what is considered NSFW (which can vary from one person to another and lead to potential censorship), we need to find a centralized option.
Filtering out these images should be done from Commons, maybe using https://commons.wikimedia.org/wiki/MediaWiki:Gadget-NSFW.js as already suggested. If any NSFW image is in an article, we could exclude that article.
Related: T264045: Not Safe for Work (NSFW) media Classifier for Wikimedia Commons.
Ideally, NSFW articles would be flagged in the search index, but that seems way beyond what's plausible for us to do.
Another, less ideal but more realistic option would be to store the image's NSFW state somewhere (MCR? page_props? T282585: TDMP DR: Provide for asynchronously-available MediaWiki parser content fragments / components might be relevant, although that is about properties derived from page content, and files are handled very differently), expose it in imageinfo API requests, and then filter on the client side as we do for protected articles.
All this is blocked on T279416: Deploy Image content filtration model for Wikimedia Commons, though (the NSFW gadget uses WDQS, and I doubt we want to do that in production frontend code).
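For the client-side filtering step, a rough sketch of what this could look like, assuming a hypothetical NSFW flag exposed through imageinfo's extmetadata (no such field exists today, and the actual storage is still undecided):

```javascript
// Sketch only: decide whether an article uses any image flagged as NSFW.
// Assumes a hypothetical "NsfwScore" extmetadata field on file pages;
// nothing like it is exposed by the API today.
function articleHasNsfwImage( title ) {
	return new mw.Api().get( {
		action: 'query',
		titles: title,
		generator: 'images', // all files used on the article
		gimlimit: 'max',
		prop: 'imageinfo',
		iiprop: 'extmetadata',
		formatversion: 2
	} ).then( function ( response ) {
		var pages = ( response.query && response.query.pages ) || [];
		return pages.some( function ( page ) {
			var info = page.imageinfo && page.imageinfo[ 0 ];
			var meta = info && info.extmetadata;
			// Hypothetical flag; the threshold is picked arbitrarily for the sketch.
			return meta && meta.NsfwScore && parseFloat( meta.NsfwScore.value ) > 0.5;
		} );
	} );
}

// Usage: filter the suggestion list client-side, like we do for protected articles.
// Promise.all( titles.map( articleHasNsfwImage ) ).then( ... );
```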
@Miriam -- FYI this is another use case for NSFW work on images from Commons. Beyond just screening out suggested images for the "add an image" task, such a filter could be used to screen out any articles from any task types (e.g. "add a link") that contain NSFW images.
We have a case at ar.wp where it is suggested to add an adult-movie image to an article about sexuality.
The article is on ar.wiki:
https://ar.wikipedia.org/w/index.php?title=%D8%B7%D9%84%D9%82%D8%A9_%D9%85%D9%86%D9%8A
The community is not likely to accept what is proposed here.
The community is not likely to accept any task to decide what images should be included in or excluded from filtering, nor is the community likely to accept any mechanism which default-blocks content anywhere on the platform.
You should note that "NSFW" is only the second most requested kind of filtering. The number one filtering demand is religious, particularly images of Muhammad.
The community has a particular dedication to our educational mission, a particular dedication to neutrality, and by extension a particular dedication to our NOTCENSORED policy. We do not accept demands to remove, filter, or hide religious images based on culturally subjective demands of some people to restrict other people's access to religious imagery. And similarly, we do not accept demands to remove, filter, or hide images based on culturally subjective demands of some people to restrict other people's access to allegedly "NSFW" images. Note that when such a filter was advocated by the WMF, there were demands that any images of women be filtered. The Burka standard.
Anyone trying to implement filtering on "newcomer tasks" or anywhere else is sure to get verbally beaten down by the Muhammad, Burka, and other arguments. Such proposals always fail because they are based on inherently subjective cultural opinions about which images should or should not be filtered.
It is not acceptable for anyone here to attempt to ban "newcomers" from working on religious articles - including the Muhammad article with images of Muhammad. And if some "newcomer" is adding images to cancer articles, then yes, they absolutely should be getting "NSFW" images of naked breasts for the Breast cancer article.