
Continue to increase the amount of Wikidata data in Wikisources
Open, Needs Triage, Public


A few Wikisources are moving metadata to Wikidata and then reading it back again with custom Lua modules. English, French, and Spanish have done this for authors (see Q26966406) to some degree.

This task is to continue this process where appropriate, and so reduce the duplication of data in Wikisource.

Modules need to be created for authors and works, and integrated into the existing templates for these. There are still lots of questions to be answered about how best to set up certain metadata on Wikidata, but there are lots of things we do seem to be pretty settled on. For example, a work on Wikisource is a specific edition, and so has a publication date of its own (i.e. not necessarily the date of original publication), and this needs to be displayed in Wikisource. At the moment, it's usually recorded both there and at Wikidata.

If this hack session achieves nothing else, even getting a single attribute read from Wikidata (when it's left blank in the local template) would be a good result. In doing so, we learn more about what data structures we need, and we reduce the chances of error throughout the system.
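The "local value wins, Wikidata fills blanks" rule described above can be sketched in a few lines of Lua. The `valueOrWikidata` helper name here is hypothetical; the `mw.wikibase` calls shown in the comment are real Scribunto APIs but are only available inside MediaWiki, so they are not executed in this sketch.

```lua
-- Sketch only: return the locally supplied template argument if it is
-- non-blank, otherwise fall back to the value fetched from Wikidata.
local function valueOrWikidata(localArg, wikidataValue)
    if localArg ~= nil and localArg ~= '' then
        return localArg
    end
    return wikidataValue
end

-- Inside a real Scribunto module, wikidataValue would come from
-- something like:
--   local entity = mw.wikibase.getEntity()
--   local statements = entity:getBestStatements('P577')  -- publication date
```

This keeps local overrides possible while letting Wikidata supply any field the editor leaves blank, which is the behaviour the task asks for.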

[Feel free to edit this description if it's lacking anything.]

Related Objects

Event Timeline

Reedy renamed this task from Continue to increase the amout of Wikidata data in Wikisources to Continue to increase the amount of Wikidata data in Wikisources.Feb 27 2017, 12:05 PM

Hi @Samwilson are you planning to work on this at the Hackathon in Vienna? If not, I will feature this for volunteers to take on.

@srishakatux yes, I will work on this, but it'd be great to do it with other people too so perhaps it's best to open it for volunteers.

Is there work being done on this task? I am curious about it and would like to join in.

@MrSteff there's some current discussion about

If you want to work on some Lua, the module in question is not functioning yet, and needs lots of work on things like displaying related work-level data from editions (however, this may be hard, as P629 currently sometimes has more than one value).
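One way a module could handle the multi-value P629 case mentioned above is to refuse to guess when an edition links to more than one work, and let callers fall back to local template data. This is a sketch; `singleWorkId` is a hypothetical helper, and the `mw.wikibase` call in the comment is the real Scribunto API that would feed it inside MediaWiki.

```lua
-- Sketch only: return the single work item ID if exactly one P629
-- ("edition or translation of") value exists; otherwise return nil so
-- the caller can fall back to locally supplied data.
local function singleWorkId(p629Ids)
    if p629Ids ~= nil and #p629Ids == 1 then
        return p629Ids[1]
    end
    return nil
end

-- In a real module the ID list might be built from:
--   local statements = mw.wikibase.getEntity():getBestStatements('P629')
```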

i would like to see explored the means to map and process data through this workflow:

  1. file starts at IA,
  2. upload the file as an edition to Commons (usually, though it can be to the WS), utilising {{book}}
  3. create an index page at the WS, and if the gadget is utilised, the data fields in the index are populated
  4. work is transcribed... (time passes)
  5. work is transcluded to main ns
  6. then the Wikidata item appears and is populated (often too briefly, and it has to come after creation/transclusion so the wikilink is in place for data flow)

For the WSes the wikidata item will usually be for the edition(s), whereas for the WPs it is the literary work (generally). [disconnect hinted at in task]

so i see that there is some good scope at WD to look at

  • internal to WD: literary work (parent) <-> edition tools, to make that expansion better/easier/quicker
  • inhaling IA data and then being able to apply it for a scan/index/book/edition

so much opportunity!


How feasible would this task be for me as an utter novice regarding Wikidata? I've read through your links but honestly don't understand much.. :/