@Jdforrester-WMF you said elsewhere this was resolved?
Mon, Mar 18
This is now deployed.
Sat, Mar 16
Fri, Mar 15
Thu, Mar 14
Update: checked this again and I'm still getting a similar error (missing dependencies) with a completely fresh install of the vagrant wikidata role.
Wed, Mar 13
Thanks; I've been thinking about doing this for a while, actually. The Zotero format, which is our native internal format, has separate fields for first name and last name, but it is indeed a bad assumption. Unfortunately, since this data is coming from Zotero, I'm not sure we can fix it on our end in this particular case, except by trying to reconstruct the split, or maybe by overwriting it.
Mon, Mar 11
Thu, Mar 7
Somewhat amusingly, with this one the anchor is kept in the DOI (where it shouldn't be) but not in the URL: https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/https%3A%2F%2Fwww.jstor.org%2Fstable%2F10.14321%2Frhetpublaffa.21.2.0279%3Fseq%3D1%23page_scan_tab_contents
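For reference, splitting the anchor off a URL like that before it ends up in the DOI could be sketched like this (a hypothetical illustration with Python's standard library, not the actual citoid code):

```python
from urllib.parse import unquote, urldefrag

# The percent-encoded target URL from the citation request above
encoded = ("https%3A%2F%2Fwww.jstor.org%2Fstable%2F10.14321%2F"
           "rhetpublaffa.21.2.0279%3Fseq%3D1%23page_scan_tab_contents")
url = unquote(encoded)

# urldefrag splits the URL into everything before the "#" and the fragment
base, fragment = urldefrag(url)
print(base)      # https://www.jstor.org/stable/10.14321/rhetpublaffa.21.2.0279?seq=1
print(fragment)  # page_scan_tab_contents
```

The `base` part is what should feed into any DOI extraction; the `#page_scan_tab_contents` anchor only identifies a position on the page.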
Wed, Mar 6
Mon, Mar 4
They've whitelisted our IPs on request, so this is now resolved.
It's doing it again :). Not sure if I should create a new ticket or just re-open this one, but it's the same issue.
Thu, Feb 28
Relevant comment from zotero: https://github.com/zotero/translators/issues/1092#issuecomment-468308989
I'm thinking this might make sense as two different tickets: for your use case, @Pintoch, you want the graph of the actual content-language fallbacks for all the individual wikis, correct? That would be a MediaWiki API thing.
Thinking about this more, I think we can be quite greedy with the label: we can set it at least in the guessed text language, the user language, and English. This is because when we create items we're dealing with publications published in a particular language, like books and journal articles, where an untranslated label wouldn't necessarily be a bad thing and may even be the most correct thing. That is, for edition or translation items you actually might want the title as published, not the translated title (as opposed to a book item, where you might want the name of the book as it was published in each given language).
Wed, Feb 27
@Lucas_Werkmeister_WMDE would it make sense to expose the commented-out equivalence information in https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/lib/includes/WikibaseContentLanguages.php#L135, in case we get back any language codes like those?
Tue, Feb 26
So it turns out that JSTOR is just blocking us. They've done this in the past: T88323
I agree it seems a safe bet, although Zotero (which we're now using for all the scraping) is a little more cautious about using metadata that appears in the head unless there's a custom-built translator for the site, because you can't guarantee the metadata refers to the article itself and not to something else on the page, like a comment, for example.
Mon, Feb 25
Sun, Feb 24
Feb 15 2019
Both DOIs seem to be working in production, but it appears I forgot to write a test for this, so I'll re-open until that's merged. :)