User Details
- User Since: Aug 21 2015, 1:38 PM
- Availability: Available
- LDAP User: Unknown
- MediaWiki User: James Hare (NIOSH)
Mar 6 2017
For whatever reason, the cron job I had set up to run the ingest script on a daily basis did not work. So I just run the script every time I need data, based on the last time the report was run. Definitely not ideal.
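As a rough illustration of that interim workflow (the module and helper names below are hypothetical, not the actual script), a catch-up run could look something like this:

```python
# Rough sketch of a catch-up run; the helper names are hypothetical.
from datetime import date, timedelta

from ingest import get_last_report_date, ingest_day  # hypothetical helpers


def catch_up(today=None):
    """Ingest every day between the last report run and yesterday."""
    today = today or date.today()
    day = get_last_report_date() + timedelta(days=1)
    while day < today:
        ingest_day(day)  # pull and store one day of data
        day += timedelta(days=1)


if __name__ == "__main__":
    catch_up()
```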
Nov 22 2016
@ezachte I would be interested in the data for full-size images vs. thumbnails/icons, but this API focuses on the number of times a playable media file has been played. That said, extending it to include this additional data should not be difficult.
Nov 21 2016
That is correct @ezachte, or at least those are the columns I went with (so I hope they're correct!).
Nov 15 2016
@MusikAnimal do we have any fair use sound recordings on Wikipedia? In any case, the code filters out any URL path that isn't /wikipedia/commons and further filters out anything that isn't playable media (i.e. video and sound).
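A minimal sketch of that filtering, with an illustrative (not exhaustive) list of playable file extensions:

```python
# Sketch of the filtering described above; the extension list is illustrative.
PLAYABLE_EXTENSIONS = (".ogg", ".oga", ".ogv", ".webm", ".wav", ".flac",
                       ".mp3", ".opus", ".mid")


def is_playable_commons_media(url_path):
    """Keep only Commons-hosted paths that point to playable media."""
    if not url_path.startswith("/wikipedia/commons"):
        return False
    return url_path.lower().endswith(PLAYABLE_EXTENSIONS)


print(is_playable_commons_media("/wikipedia/commons/a/ab/Example.ogv"))  # True
print(is_playable_commons_media("/wikipedia/en/4/4a/FairUse.jpg"))       # False
```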
Nov 2 2016
All the past data has now been ingested, covering 1 January 2015 to 1 November 2016. Daily ingests will now run at around 20:00 UTC.
Nov 1 2016
Going to close this task as complete since the metrics crunching is now underway; T149642 is the task for implementing the UI.
Oct 31 2016
The API documentation is located at P4339 if you are interested in developing tools around these metrics. (Pinging @MusikAnimal.) Note that the dataset is still incomplete; work is ongoing to ingest all the data from 1 January 2015 to the present. As of this writing, the ingest script is around midway through April 2016.
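As a purely hypothetical illustration (the endpoint URL and parameter names below are placeholders; see P4339 for the actual interface), a tool could query the API along these lines:

```python
# Hypothetical example; the URL and parameter names are placeholders.
import requests

response = requests.get(
    "https://example.org/api/FilePlaycount/date_range",  # placeholder endpoint
    params={"file": "Example.ogv", "start": "20160101", "end": "20160131"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```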
Oct 26 2016
I have created the infrastructure for logging the play counts in a central database. Currently I am working on ingesting all the historical data going back to the first day of data on January 1, 2015. Once that is done there will be a daily script that adds the prior day's values.
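A minimal sketch of what such a daily job could look like, using placeholder table, database, and helper names (the real infrastructure may differ):

```python
# Minimal sketch with placeholder names; not the actual ingest code.
import sqlite3  # stand-in for the central database
from datetime import date, timedelta

from ingest import fetch_play_counts  # hypothetical: {filename: count} for a day


def ingest_yesterday(db_path="playcounts.db"):
    day = date.today() - timedelta(days=1)
    counts = fetch_play_counts(day)
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS play_counts "
            "(filename TEXT, day TEXT, plays INTEGER)"
        )
        conn.executemany(
            "INSERT INTO play_counts (filename, day, plays) VALUES (?, ?, ?)",
            [(name, day.isoformat(), plays) for name, plays in counts.items()],
        )


if __name__ == "__main__":
    ingest_yesterday()
```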
May 13 2016
At minimum, you should be able to manually select wikis where you know there is activity. For example, I know that at my trainings we edit both English Wikipedia and Spanish Wikipedia. Selecting the databases by hand is probably more performant than doing a scan of all of the databases, even if there is an off chance we miss edits on the Portuguese Wiktionary or whatever.
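A sketch of the hand-picked approach, assuming the usual *wiki_p replica naming; the connection details and the query itself are illustrative placeholders:

```python
# Illustrative sketch: count a user's edits on a hand-picked list of wikis
# instead of scanning every database. Connection details and the query are
# placeholders for whatever the tool actually uses.
import pymysql

SELECTED_WIKIS = ["enwiki", "eswiki"]  # chosen by the program organizer


def count_edits(username, host):
    counts = {}
    for wiki in SELECTED_WIKIS:
        conn = pymysql.connect(host=host, db=f"{wiki}_p")
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT COUNT(*) FROM revision_userindex WHERE rev_user_text = %s",
                    (username,),
                )
                counts[wiki] = cur.fetchone()[0]
        finally:
            conn.close()
    return counts
```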
Apr 25 2016
I didn't get this error for other articles from that journal, and I got the same error at different times of day.
Apr 15 2016
I tried to "stress test" the system and managed to get 114 uploads in before it started throwing the error message.
Dec 18 2015
It magically works now. You can close this if you'd like. Though it's worth looking into what causes a file to be rejected and subsequently accepted.
Incidentally, the file seems to have uploaded to Phabricator just fine!
Nov 5 2015
At the National Institute for Occupational Safety and Health we are working on incorporating our datasets into Wikidata. One of the benefits of this is that it allows for translation of our content with less effort, since the relationships ("X causes Y") are language neutral and people only need to translate the terms ("X" and "Y").
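As a small illustration (the identifiers and labels below are placeholders, not our actual data), the statement itself is stored language-neutrally and only the labels need translating:

```python
# Placeholder identifiers and labels; only the labels vary by language.
statement = {"subject": "Q100001", "property": "P100", "object": "Q100002"}  # "X causes Y"
labels = {
    "Q100001": {"en": "X", "es": "X"},
    "P100": {"en": "causes", "es": "causa"},
    "Q100002": {"en": "Y", "es": "Y"},
}


def render(stmt, lang):
    return " ".join(labels[part][lang] for part in
                    (stmt["subject"], stmt["property"], stmt["object"]))


print(render(statement, "en"))  # X causes Y
print(render(statement, "es"))  # X causa Y
```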
Oct 30 2015
@kaldari captures it well. Measurements have inherent uncertainty, since measurement instruments such as scales can only measure to so many decimal places. So if a scale tells you something has a mass of 3.24 grams, it could really be 3.239 or 3.241, but the scale cannot measure at that level of precision, so it rounds. I would say that anything that is a measurement should have a default uncertainty of one unit in the last decimal place (for my example, +/- 0.01). An option to override that would be appropriate, of course.
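A small sketch of that default rule (override behavior omitted): take one unit of the last decimal place of the entered value.

```python
# Sketch of the proposed default: one unit of the last decimal place.
from decimal import Decimal


def default_uncertainty(value_text):
    """'3.24' -> Decimal('0.01'); '12' -> Decimal('1')."""
    exponent = Decimal(value_text).as_tuple().exponent
    return Decimal(1).scaleb(exponent)


assert default_uncertainty("3.24") == Decimal("0.01")
assert default_uncertainty("12") == Decimal("1")
```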
Aug 27 2015
We already have Citoid and the citation generation function in VisualEditor. Wouldn't the most effective approach be to build a version of that for the wikitext editor?
Aug 25 2015
As part of my work with the National Institute for Occupational Safety and Health (part of the U.S. Centers for Disease Control and Prevention), I've drawn up some leaflets that take a different approach to training Wikipedia editors. I found that the current materials focus heavily on wikitext markup, whereas I would like to get beyond that and have people focus on meaningful content contributions facilitated by VisualEditor. The flyers are available below for all to use: