
[Epic] Make the Wikimedia Commons Android app ready to work with (multilingual) structured data on Wikimedia Commons
Open, LowPublic

Description

The Commons Android app is a smartphone app that supports uploads to Wikimedia Commons.

It would benefit hugely from the ability to enter information about files in any language during the upload process, in such a way that this information becomes instantly available (and findable) in many languages. This can be made possible by enabling structured data in the upload process. As soon as structured data becomes available on Wikimedia Commons by the end of 2018, it would be great if the Commons Android app could also support it.

Event Timeline

User story:

As an uploader of media files in the Commons Android application, I must be able to add information about my uploaded files in my own language, using multilingual terminology that makes it possible for speakers of other languages to find and understand my uploads as well.

SandraF_WMF updated the task description. Nov 22 2017, 11:27 AM
SandraF_WMF lowered the priority of this task from Normal to Low. Nov 22 2017, 12:55 PM
SandraF_WMF moved this task from Backlog to Team radar on the Community-Relations-Support board.
SandraF_WMF moved this task from Backlog to External tools on the SDC General board.
Lydia_Pintscher moved this task from incoming to monitoring on the Wikidata board. Dec 18 2017, 3:04 PM
Nicolas_Raoul added a subscriber: Nicolas_Raoul. Edited Feb 14 2018, 7:47 AM

User story:

While attending the concert, I upload a picture of AKB48 performing at Fuji Rock.
Besides uploading the file and adding a caption and description in my language, I must also add "structure" (what used to be handled by categories, among other things).

As I am on mobile, searching for all structural elements manually would be too hard. I want the app to suggest these structural elements to me. Examples:

  • The picture embeds the EXIF latitude/longitude of the concert, so I am offered "Sanroku" and "Naeba" as suggestions for the location property.
  • For the "depicting" property, I am suggested "Ski slope" and "concert" (this location is used for skiing during winter and for concerts during summer). This is found by retrieving the properties of all nearby pictures and suggesting the (property, value) pairs that are most commonly found in that area.
  • Structural elements that I have used recently are suggested too.
  • For the "depicting" property, "AKB48" is suggested too, as I have entered it in the caption or description (yes, this means searching for structural elements in my language: "AKB48" may be the same in all languages, but my "宇多田ヒカル" becomes "هیکارو اوتادا" for another user).

The Android app currently does all of this (though not multilingually).
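The "most commonly found in that area" ranking from the second bullet can be sketched as a simple frequency count over the statements of nearby files. All names and data below are illustrative, not the app's actual API:

```python
from collections import Counter

def suggest_statements(nearby_statements, top_n=3):
    """Rank (property, value) pairs by how often they occur on nearby files.

    nearby_statements: a list of (property, value) tuples harvested from
    files geotagged near the upload location (hypothetical input shape).
    """
    counts = Counter(nearby_statements)
    return [pair for pair, _ in counts.most_common(top_n)]

# Example: statements harvested from three nearby pictures of the same venue.
nearby = [
    ("depicts", "ski slope"), ("depicts", "concert"),
    ("depicts", "ski slope"), ("location", "Naeba"),
    ("location", "Naeba"), ("depicts", "ski slope"),
]
print(suggest_statements(nearby, top_n=2))
# → [('depicts', 'ski slope'), ('location', 'Naeba')]
```

Recently used structural elements (third bullet) could simply be merged into the same ranked list with a fixed bonus weight.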

By the way, some of the problems with Commons categories:

  • Very few are geolocalized, so it is hard to provide relevant suggested categories given a latitude/longitude. With Structured Commons a single SPARQL request would solve the problem.
  • Not multilingual. As you probably know, most humans can't speak English.
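The "single SPARQL request" mentioned above might look roughly like the following. The geospatial `wikibase:around` service and the coordinate property P625 are real parts of the Wikidata Query Service; how exactly a Structured Commons endpoint would expose file data is still an assumption, so this sketches the Wikidata side only (coordinates are illustrative values near Naeba):

```python
def nearby_items_query(lat, lon, radius_km=2):
    """Build a WDQS SPARQL query for Wikidata items whose coordinate
    location (P625) lies within radius_km of a point. With structured
    data on files, the same pattern could drive location suggestions."""
    return f"""
SELECT ?item ?itemLabel WHERE {{
  SERVICE wikibase:around {{
    ?item wdt:P625 ?coord .
    bd:serviceParam wikibase:center "Point({lon} {lat})"^^geo:wktLiteral .
    bd:serviceParam wikibase:radius "{radius_km}" .
  }}
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en" . }}
}}"""

# WKT points are longitude-first, hence Point(lon lat).
print(nearby_items_query(36.85, 138.78))
```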

User story:

I am waiting for my bus, and have 20 minutes to kill.
I touch "Nearby places" in the Commons app, and a map opens showing me the notable places around me that lack a Commons picture.
I walk to the nearest one, "Sabuli town hall", touch the camera icon, take a picture and confirm.
The app:

  • Uploads the picture to Commons,
  • Links it from the Wikidata item (the map shows Wikidata items that lack a Commons picture, so we know the QID),
  • Adds the category found in the Wikidata item's P373 "Commons category" statement, if it has one (many Wikidata items have a Commons category but lack a P18 image, sometimes because only centuries-old images from colonial times are available, but most often because nobody has taken the time to select one).

Most of this user story's features have long been popular with the app's users; just the last two bullets are in development right now.
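Linking the upload from the Wikidata item boils down to adding a P18 ("image") claim via the MediaWiki API's real `wbcreateclaim` module. A minimal sketch of the request parameters, assuming a hypothetical QID and token (the app's real code goes through its own API layer):

```python
import json

def p18_claim_params(qid, commons_filename, csrf_token):
    """Build MediaWiki API parameters that add a P18 (image) claim
    linking a Wikidata item to a freshly uploaded Commons file."""
    return {
        "action": "wbcreateclaim",
        "entity": qid,                          # e.g. the town hall's QID
        "property": "P18",                      # "image"
        "snaktype": "value",
        "value": json.dumps(commons_filename),  # string value, JSON-encoded
        "token": csrf_token,
        "format": "json",
    }

# "Q999" and the token are placeholders, not a real item.
params = p18_claim_params("Q999", "Sabuli town hall.jpg", "+\\")
```

These parameters would then be POSTed to www.wikidata.org/w/api.php with the user's session.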

User story:

I am looking for pictures of butterflies.
I touch the search bar, enter "butterfly", and the app starts showing me Commons pictures of butterflies, starting from the best-quality images Commons has and letting me scroll down ad nauseam.

Both the current Commons and Structured Commons can fulfill this use case... but the hard part is sorting by quality.

FastCCI used to provide an easy way to get featured, quality, and valued images within a category, but it seems FastCCI is not working anymore, and even when it was working it never really scaled.
Structured Commons will hopefully be able to provide these results with a simple SPARQL query, and faster than FastCCI, as this is the kind of query Wikibase is good at.
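Such a query might look roughly like this. P180 ("depicts") is a real Wikidata property that structured data on Commons is expected to reuse; the assessment predicate and the QID are placeholders, since the Structured Commons schema is not finalized yet:

```python
def quality_depicts_query(depicts_qid, limit=50):
    """Hypothetical SPARQL against a future Structured Commons endpoint:
    files depicting (P180) a given item, quality-assessed ones first.
    ?assessment and its property ID are assumptions, marked as such."""
    return f"""
SELECT ?file WHERE {{
  ?file wdt:P180 wd:{depicts_qid} .
  OPTIONAL {{ ?file wdt:Pxxx ?assessment . }}  # placeholder assessment property
}}
ORDER BY DESC(?assessment)
LIMIT {limit}"""

# "Q999" stands in for whatever item "butterfly" resolves to.
print(quality_depicts_query("Q999"))
```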

For the discussions around implementing file captions, see https://github.com/commons-app/apps-android-commons/issues/2297