Problems with image searching (particularly an issue for Commons, but elsewhere too):
- The current search reacts to every keyword and can produce surprising results. For example: a search for "cucumber" returns not only cucumbers, but also images of their use as a sex toy.
- When terms collide, you won't find what you're looking for. For example: a search for "monarch" returns hundreds of images of a butterfly, but very few results concerning monarchy.
The basic idea is to improve on this by clustering related search results. Roughly, this could work like this:
- The search works as usual and grabs all results by keyword.
- It looks at the categories of the results. If it finds images from different parts of the category tree, it splits the results into groups, labeling each group after its lowest shared parent category. In other words, it uses the categories to cluster the results.
- Instead of showing a flat list of images, it displays these groups, each of which can be expanded.
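The steps above can be sketched in a few lines. This is only an illustration under simplifying assumptions: the category tree is hardcoded as a child-to-parent map (the real Commons category graph allows multiple parents and cycles), and the result set and category names are made up for the "monarch" example.

```python
from collections import defaultdict

# Hypothetical, simplified category tree: child -> single parent.
# The real Commons category graph is more complex (multiple parents, cycles).
CATEGORY_PARENT = {
    "Monarch butterflies": "Butterflies",
    "Butterflies": "Insects",
    "Monarchs of England": "Monarchs",
    "Monarchs of Spain": "Monarchs",
    "Monarchs": "Heads of state",
}

def path_to_root(cat):
    """Return the chain of categories from cat up to its tree root."""
    path = [cat]
    while cat in CATEGORY_PARENT:
        cat = CATEGORY_PARENT[cat]
        path.append(cat)
    return path

def cluster_results(results):
    """Group (title, category) search results by the branch of the
    category tree they fall in, labeling each group with the lowest
    category shared by all of its members."""
    by_root = defaultdict(list)
    for title, cat in results:
        by_root[path_to_root(cat)[-1]].append((title, cat))

    labeled = {}
    for members in by_root.values():
        paths = [path_to_root(cat) for _, cat in members]
        common = set(paths[0])
        for p in paths[1:]:
            common &= set(p)
        # Paths are ordered lowest-first, so the first common entry
        # is the lowest shared parent category.
        label = next(c for c in paths[0] if c in common)
        labeled[label] = [title for title, _ in members]
    return labeled

# Invented sample results for a "monarch" query.
results = [
    ("Monarch on milkweed.jpg", "Monarch butterflies"),
    ("Danaus plexippus.jpg", "Monarch butterflies"),
    ("Elizabeth II.jpg", "Monarchs of England"),
    ("Felipe VI.jpg", "Monarchs of Spain"),
]
print(cluster_results(results))
# The butterfly images and the monarchy images end up in two separate,
# labeled groups instead of one mixed list.
```

With this toy data, the monarchy images cluster under "Monarchs" (their lowest shared parent) rather than under the broader "Heads of state", which is the behavior the proposal describes.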
A clustered search would not only be much more useful, it would also solve (as far as searching is concerned) the problem that the WMF's image filter is supposed to address, without any need to specially classify or tag individual images.
There is a more detailed explanation (and an image mockup) at https://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Clustering_for_search_results_on_Commons
Bugzilla is not a good format for discussing this idea (it doesn't even have a Preview button!), but we'd like to put it on developers' radar and get some feedback if possible. Please feel free to leave comments on Meta in addition to Bugzilla.