User Details
- User Since: Dec 10 2014, 9:37 AM (442 w, 5 d)
- Availability: Available
- LDAP User: Jane023
- MediaWiki User: Unknown
Aug 13 2021
The useful Kaggle link:
https://www.kaggle.com/heesoo37/120-years-of-olympic-history-athletes-and-results
May 11 2021
It’s probably a good idea to have a specific “round table” if only to gather the various issues. Personally I have found Listeria extremely useful both for quick checks and long-term control of various projects, but the decreasing reliability (which I chalk up to exponential growth, more or less fueled by the ease of edits enabled by Listeria) is an issue. That said, observing my own behavior over time, I admit that Listeria lists are slowly becoming a place to park my queries, and I prefer to run queries directly when I am working on a specific corner of Wikidata. So it may be useful to have two types of lists, one for more static, “parked” usage and one for more active usage. I am thinking of a situation where fewer than 3 manual updates a month might trigger parked status.
Sep 26 2020
I disapprove of the action being requested here (I gave a thumbs-down token) but I love this discussion. I love it especially because the elephant in the room is nowhere to be found in the (admittedly very thoughtful) comments. Let me start by suggesting that not all Wikipedias are created equal, and some are considered more equal than others. On a sliding scale of equality, I think I would begin with the Finnish Wikipedia and end maybe with various failed incubator wikis or that story of the hijacked Wikipedia (sorry, can't remember which that was). The ptwiki has always fascinated me because the European origin of the language is physically so far away from most speakers of the language. In a way, it is a breeding ground for social unrest on a global scale against colonial views and antiquated notions of what we like to call "reliable sources". I would love to see a demographic breakdown of the abusive IP edits, but sadly creating that would probably be against our principles. Maybe it has been done on a voluntary basis already, and if so I for one would love to read the English summary! Otherwise, maybe this would be a good thing to organize: appoint local contributors per major city to conduct local editor surveys.
May 9 2020
I don't have any development environment installed for Wikimedia projects at all. I only use already-created tools, these days mostly Listeria and PetScan (both by Magnus Manske).
OK, I subscribed to this task because I watched the StreamYard thing (thx Andrew & co.!), but am not sure if we are talking about the same thing. One of the things I dream of is getting rid of Excel in my workflow (also because with each laptop/mobile device I have issues finding stuff and picking up where I left off). Ideally I could take an item like the one I want to create (another portrait painting from the same collection, catalog, or artist), generate the QuickStatements commands needed to create that one, and have them appear in a notepad-like window I can easily update and feed into QuickStatements, or better yet, just dump them directly into the QS old-style notepad-like window!
Mar 16 2020
I agree with earlier posts and would say I have zero motivation to contribute to SDoC until such a query service is available to me. Working on paintings, I find Wikidata workflows much easier to monitor; lacking that for Commons, I can't make any overviews or run comparative checks.
Feb 11 2020
This is apparently a sub-task of T54564, which is the Bonnie and Clyde problem (see Help:Handling_sitelinks_overlapping_multiple_items on Wikidata). I think the best solution is to radically delete all redirects from Wikidata, certainly not to add badges to them.
Dec 10 2019
No to the OAuth issue: OAuth is visibly logged in. I cannot even paste into the window to get the import loaded, so I am unable to answer the rest of your question. All edits (before and after, with the old interface) are correctly logged under my user ID.
I noticed this late last night and had been using QS all day. I just switched to the old interface and that is still working. So something must have happened around dinner time? Like 6-9 PM GMT or Western European time.
Nov 1 2019
Having spoken briefly with Envel on Sunday in Berlin after WikidataCon, I think it would be good to break this up into various tasks. We have a largish problem with articles coming into Wikidata semi-automatically from various Women-in-Red editathons (all languages including English) that are not tagged as human or female. There are various people who work on these in various semi-automated projects. It would be nice to be able to measure these specifically somehow as a group so we can do better at catching them on the day, at the source. Once they are tagged as human and female, they come into "the grand bit bucket that cannot be queried due to time-outs". I propose setting up various measurements to tackle the 2-statement items that need further sorting; the 3+ statement items can then be set up in various other visualisations. Once you have occupations set up, the same visualisations can be applied per occupation. The more specific the data, the more volunteers there are who are willing to help with improvement tasks. For example, it is much easier to improve the item of a woman who is a professional tennis player once you know that is her occupation. I think if we set it up correctly, the totals can be generated from "lists of lists" that possibly drill down to item level instead of a grand database query.
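As a minimal sketch of the kind of per-occupation measurement meant above (the QID used here for the tennis-player occupation is my assumption, and in practice such a query may still need further narrowing to avoid time-outs):

SELECT (COUNT(?item) AS ?count) WHERE {
  ?item wdt:P31  wd:Q5 ;            # human
        wdt:P21  wd:Q6581072 ;      # sex or gender: female
        wdt:P106 wd:Q10833314 ;     # occupation: tennis player (assumed QID)
        wikibase:statements ?st .   # total number of statements on the item
  FILTER(?st <= 3)                  # only the barely-described items that still need sorting
}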
Oct 27 2019
Wow this is brilliant! Thanks so much!
So to be clear, if the file was clearly marked with a green border or something, then that would be a prompt for me to go look at it and attempt to substitute it.
Sep 12 2019
As far as reports go, it would be nice for the portrait part of the Sum of all Paintings if we could easily see the missing women in the genealogy chains from circa 1450 onwards. Often we already have the portrait items of the women, but are missing the human items for the women. Meanwhile the men seem to spawn from generation to generation 100% from the male chromosomes, and only half of pendant portrait pairs are ingested as data. If we can identify missing wives/mothers/daughters, we can interlink the female portraits to the sitters. Sometimes it goes the other way: we have the portrait as an image illustrating the human item, but no portrait item yet. We want both!
Aug 10 2019
If you want to add depicts it is helpful if you can see the image at the same time, even if it is just the thumbnail as a reminder.
Jul 23 2019
Magnus told me about this tool, which gathers painting items with the same creator in the "painted by" description. You need to fill in the Q number of the artist, which you can create or look up. https://tools.wmflabs.org/mix-n-match/painters.php
Jun 22 2019
This is not a bug; everything is working as expected. The problem was on the user side, in their interpretation of the error message, which was itself entirely correct.
Jun 21 2019
No I have not explained it well then. Try to think of it in terms of postal districts or voting districts. As a town grows, generally neighboring villages are annexed, but they do not lose their postal codes or voting district names. This is just as true for the original "core" district as it is for any other municipal district. In this case the core district and the wider municipal area have the same name. That is all.
Yes, sorry if the statements seem so ambiguous; this is just one of many cases where such things occur. One is a historic city and one is a current city. The problem probably arises because there is so little left of the original historic center in today's city, due to WWII bombing. The maps of both should make it clear, as do the titles in cebwiki themselves, which simply reflect the Dutch. A municipality has generally over time annexed various lands due to growth. The historic city limits generally follow some historic city wall footprint, though in this case it is hard to tell.
The message you received was correct. There is a help page for this: https://www.wikidata.org/wiki/Help:Merge
I am back and after checking, happy to report I was right. One is the town, and the other is the municipality. They are not the same and should not be merged.
This is a known issue and happens all the time. The wiki has a dup, a Wikidatan sees it and attempts a merge, and the failure raises a bug for Wikidata. The proper way to deal with it is to 1) assign the property “said to be same as” to one of the Wikidata items and 2) on the wiki page of the same item, add that wiki's equivalent of the English Wikipedia merge template. That said, and sight unseen, I know enough about cebwiki to know that its granularity of topics is higher than on most, if not all, other Wikipedias, and the mistake is probably in the user's interpretation. In short, without checking, I will assume these should not be merged at all.
Dec 18 2018
Nov 20 2018
No, for me it doesn't show up. Also, the query should return 74 items and it only returns 69 for me. The same problem exists for the same Q when you run the query against Q58590616 instead: this time I get 78 results when it should be 80. So there seems to be a small group of problematic items that don't get counted each time the query runs.
This is my query which does not return all items:
SELECT ?item ?catcode WHERE { ?item p:P528 [ pq:P972 wd:Q53207781 ; ps:P528 ?catcode]. }
Example missing item: Q31158443. This item was created using the duplicate item gadget, so not sure if that matters.
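For reference, a sketch of the same query run against the second catalog item mentioned above (assuming Q58590616 is simply substituted for Q53207781 as the value of the catalog qualifier):

SELECT ?item ?catcode WHERE {
  # catalog code (P528) statements whose catalog (P972) qualifier points at the other catalog item
  ?item p:P528 [ pq:P972 wd:Q58590616 ; ps:P528 ?catcode ] .
}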
Nov 14 2018
This appears to be the same issue I reported here. Hope this is just a question of waiting a few days for servers to be in sync, and not months for a bug fix.
Oct 27 2018
The 6 test items uploaded fine, but these were the first 6 ids (1, 2, 3, 4, 5, 6) and after upload they were added in decimal format with ".0" after the number. OpenRefine is great when it works as expected... Meanwhile Multichill has also added some images to existing items. The Listeria list is here for now; a war memorial WikiProject should probably be made to park this stuff more permanently: https://www.wikidata.org/wiki/User:Jane023/oorlogsmonumenten
Nov 15 2017
Ah, I believe you are confusing two issues: future software implementations, and current backlog cleanup drives paired with reducing the future backlog. Implementing this will at least allow people to correct the stuff they upload (or have already uploaded with the default uploader).
This issue will not go away with Structured Commons, just to be clear on that. To help understand this issue, here is a mapping from the artwork template to Wikidata using an example from a 17th-century painting: https://commons.wikimedia.org/wiki/File:Commons_vs_Wikidata_for_PD-Art-100-1923_images.png
Nov 7 2017
I have come to understand (mostly through trial and error, and then asking around) that to reduce the time a query takes, you need to start with the thing that has the smallest number of items in the group you want to query. As these "sub-groups" get bigger and bigger, I need to become more and more creative over time about splitting them in order to run the same query on a yearly basis. It would be nice if these large "sub-groups" could have their own reporting instances in query-land somehow, so e.g. all women-items, all people-born-in-city items, all items-with-sitelinks-in-a-specific-project (language WP, language WS, Commons), etc.
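To make the ordering idea concrete, a minimal sketch (the optimizer hint is just one way to ask WDQS to keep the written order; whether starting with the smaller group actually helps depends on the query planner):

SELECT (COUNT(DISTINCT ?person) AS ?total) WHERE {
  hint:Query hint:optimizer "None" .   # keep the triple patterns in the written order
  ?person wdt:P106 wd:Q1028181 .       # start with the smaller group: occupation painter
  ?person wdt:P21  wd:Q6581072 .       # then narrow down to women
}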
Oct 6 2017
Hmm, that is nice, but it doesn't seem to work too well for Rembrandt. I guess his paintings have traveled too far over time. I like the plot of the places of residence, but it would be nice to have them plotted with lines in order of residence. We have lots of people on Wikidata (not just painters) who, unlike Rembrandt, were born in one country, lived in another country, and died in yet another country. Sometimes location data helps to get these people onto the radar of various nationally oriented wikiprojects. I didn't know that Descartes lived for a short while within bicycling distance of my house. It made me look at his work in a different way.
Oct 5 2017
Yes that map for Leiden looks interesting on a per-location basis. I really like the idea of connecting the dots however. It also gives people some motivation to type in the places artists lived and worked, rather than just stopping at the birth and death places.
Jul 16 2017
I think the problem that forms the basis of this task is not very well formulated. It might be worthwhile to convert this task into one of documentation on Wikidata, and if people still feel strongly after understanding all the issues, then maybe link it to some strategy page on Meta. As far as I can tell, the main reason this functionality is desired at all is due to notability concerns on various projects. Allowing a Wikidata sitelink to a Wikipedia redirect page would enhance the notability of that redirect (somehow) in that specific wiki project. Another idea was that using Wikidata information in Wikipedia infoboxes would be enhanced if this functionality were enabled. Notability for each wiki project is different from notability on Wikidata, and I don't think Wikidata can or should even try to become some sort of notability indicator. From the other end of the spectrum, maybe the conversation we should be having is about eliminating all red links that don't link out to a Wikidata item. This would force page creators to go to Wikidata first, and that would have the added benefit that we would have a much smaller backlog of merges on Wikidata. Wikipedia redirect links could similarly all be converted into greenlinks that link to a Wikipedia page as well as a Wikidata item. This would have two benefits for the page creation process: 1) new pages would be properly linked on Wikidata, and 2) users of greenlinks would be able to easily opt for the Wikidata item (which may be useful if it differs from the redirect target).
Jul 4 2017
Maybe there can be some communication on village pumps to this effect? Because I did notice that on some language wikis I got stuck in Wikipedia and on others I didn't. This must be because of local community gadgets that should be retired now that there is a general extension (the last message I had as a user was to use the settings wheel in the media viewer itself, but this won't work anymore because of the above-mentioned problems).
Nov 5 2016
Yes I would drop the word "bias" then. It is about identifying and measuring gaps, no? There should be a link somewhere to the bias side of things (not sure where that is - diversity?)
Shouldn't this be split into biases and content gaps? The first is hard to measure (it involves the community of editors personally - e.g. who they are group-wise, such as gender, nationality, age, education, etc.) and the second is a bit easier (coverage of geo-locations, topics per expert ontology, neutral POV for breaking news, etc.).
Jul 7 2016
Well I suppose we would need some sort of global map for Commons whereby any coordinates need to fit into some container category. I think it would be well worth doing, but I am not too sure whether it should be done on Commons or Wikidata.
Interesting! I have a Windows phone so I can't test it. If it is in the app, I wonder if that is built-in functionality that comes with the app platform or if it can be added to the default uploader, because lots of new cameras put coordinates in the EXIF data these days.
May 9 2016
I have tried but have not been able to replicate this issue yet. It does appear to update the target item in real time or very close to real time.
Apr 28 2016
Interesting! Is there any way to alert the user when they hit save that the Wikidata item was not or could not be updated? I mean, I have in the past saved to a user page as a test, so obviously linking that to the Wikidata item would not work (I just saved under another name). Then when I move the article to mainspace, the original item should be shown again somehow. Maybe if the link is unsuccessful, the saved article should just include wikitext in the form of [[Category:Articles without Wikidata items]] or just <!---Original Wikidata item is Q9999--->. I suppose only the first is visible to Visual Editor users though.
Apr 1 2016
A mundane action I do on a regular basis is copying the same information from English Wikipedia to Commons and Wikidata (or any combination thereof). Some common statements on all three projects for the Sum of all Paintings project are 1) Creator name, 2) Title of work, 3) Collection, 4) Inventory number in the collection, 5) link to the artwork on the collection's website, 6) Subject of artwork, 7) Genre of artwork, and 8) Date of artwork. For more background, look at [[Category:Paintings]] on English Wikipedia or Commons. See also Maarten Dammers's work on adding Wikidata items to artwork templates on Commons and adding Commons images to Wikidata items about artworks.
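On the Wikidata side, most of those statements can be pulled back out with a query like this rough sketch (using Rembrandt as an example artist; the property choices reflect how I understand the usual painting modelling, so treat them as assumptions):

SELECT ?painting ?title ?collectionLabel ?invNo ?date WHERE {
  ?painting wdt:P31  wd:Q3305213 ;                 # instance of: painting
            wdt:P170 wd:Q5598 .                    # creator: Rembrandt (example)
  OPTIONAL { ?painting wdt:P1476 ?title . }        # title of the work
  OPTIONAL { ?painting wdt:P195  ?collection . }   # collection
  OPTIONAL { ?painting wdt:P217  ?invNo . }        # inventory number
  OPTIONAL { ?painting wdt:P571  ?date . }         # date of the artwork
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 100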
Oct 22 2015
Thanks for spelling all that out. I had no idea the job queues on Commons could run up that high, that fast. Fascinating stuff and thanks for fixing the problem.
Oct 3 2015
Thanks for the link - I saw that page. Am I correct in assuming that they need a page under the list of shortcuts that states their local "bot policy" before making the request? I assume this is easily done by translating a similar project's bot policy page, but I am checking because that whole page is not clear to me at all.
"Basically you're setting up a precedent: small wikis can circumvent failed request to establish local 'crats by simply asking devs not aware of global practices to "boost" their sysops."
Yes, I can imagine that for small or threatened language groups such requests may occur more often as we get better at providing data through Wikidata and input possibilities with font support.
Maybe if you gave a number of voters to shoot for? I am willing to vouch for anything Tulsi does, so I could go there and vote and get some friends to join me. I understand that, though it may be silly, it could be a restriction of the MediaWiki software? Or are people here really so distrusting of letting go of the bot controls? (Sorry, I am not a bot operator, so I can't see the danger.)
Hi all, after being messaged about this on Facebook, can someone please sum up the status for the followers? As far as I can tell, the problem is that there are 1) no bureaucrats or 2) not enough votes. This is a very small community that wishes to jumpstart growth with an ambitious plan stated here: https://meta.wikimedia.org/wiki/Grants:IEG/Editing_Maithili_Wikipedia
Please help them get what they need, and if you are unable to do so, please explain what they need to do to move forward on this! It's unclear to me, and my English is way better than theirs.
Aug 14 2015
Getting back to the original comment, I think we should be thinking in terms of creating an "export data package" that can be used by any Wikimedia project or any external party. Think "infobox" for Wikipedia and "image template" for Commons. Each Wikimedia project should have its own import and export wizard for such data packages. In this scenario, the sizing of the image can be something that the consuming project tweaks in its own "import data package" wizard. Magnus has sort of created an export data package with his "prepbio" tool, but it would be best if this were a two-step process, i.e. 1) export the Wikidata item to a data package, followed by 2) import the data package to English Wikipedia (or to Dutch Wikipedia, or, for painting items, to the Commons artwork template, etc.). The other way around, it would be nice to have a Commons exporter to use for item creation on Wikidata.
Jul 30 2015
There have been several RfCs on Wikidata for this issue, most recently this one (closed with no consensus):
https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Sitelinks_with_fragments
Jul 28 2015
No, the answer is not to create a sitelink to a redirect. The answer is to overhaul the Kilowatt hour article, split it into an article mostly about the history of the measurement, and link it to a new short article on Watt hour, which today is but a lowly redirect. Wikidata does not link to redirects, because it doesn't know what to do with them: they have zero data. Ask not what Wikidata can do for Wikipedia, but what you as a Wikipedian can do for Wikidata.
Jul 27 2015
Try to think of Wikidata in the same context as Wikimedia Commons. A few would probably agree with you that eventually every image will have its own Wikipedia article. I am not one of those people.
Jul 24 2015
If you don't want items to be merged, you need the separate items to be able to have sitelinks.
I see that I was unclear in my wording. I wasn't asking about a Wikipedia merge action but a Wikidata merge action. In your examples you are only referring to Wikipedia. The example you give on the football player probably comes closest to the person-duo problem, so I will spell that one out for you. The football player is a human and has a birth date and the squad is a team and has an inception date. These items do not match and should not be merged.
Jul 4 2015
The SoaP project augments existing Wikidata items about paintings, and creates items in a structured way for paintings that exist in the real world but are not yet on Wikidata. Many paintings have items because some Wikipedia somewhere has an article about the painting, or because it is in one of the metadata runs that Maarten has been doing. As he creates a body of items based on top museums, starting with the GLAMs who have already donated to Commons, I have experimented with two artists to include their body of work as documented by art historians. This is effective as a way to measure the way we model painting items on Wikidata, but also as a way of testing the "findability" of items (I have merged many doubles) and discovering collections for Maarten to "datamine". I started with Frans Hals, and applied the same concept to Pieter de Hooch a few months ago. The result for Hooch is a list on Wikipedia (en/fr/nl), built with the assistance of "Listeria", that links directly to Wikipedia painting articles or Wikidata painting items (in that order of preference): https://en.wikipedia.org/wiki/List_of_paintings_by_Pieter_de_Hooch
Jul 2 2015
We are already doing more than imagining GLAM applied to Wikidata; we are applying GLAM datasets to Wikidata at Wikidata:WikiProject sum of all paintings:
https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings
Not only do we have Maarten, who is busy adding painting items from the world's greatest collections; we also have collections that are eager to share data with us and "see what happens". The Rijksmuseum has given us their old catalog codes (from before 1976) in order to make matchups to art catalogs possible. They have also shared with us their Iconclass codes for all of the artworks for which we have Wikidata items. These have been added in the property "P1256 Depicts iconclass notation".
May 29 2015
I find this bug *extremely annoying* and would like a workaround if it can't be fixed.
May 12 2015
What you are describing is a Wikipedia merge action, which is indeed common on Wikipedia for various reasons. However, that does not indicate to me that there is any need for a corresponding merge action on Wikidata, as such a merge does not mean the Wikidata items need to be merged. In the case of a Wikipedia merge, the sitelink to the resulting redirect is left "hanging" due to a bug, but in the normal course of things it should be deleted from the associated Wikidata item. Oddly, what you are describing as an answer to my question is considered a feature by others. I call it a kludge to get what some here want, namely a Wikidata link to a redirect. My question still stands: why would you want to merge two distinct Wikidata items, if one is a person and the other is a duo?
May 11 2015
Sorry, I can't think of a case where "these "complainers" would then also have to prevent any and all attempts to merge the articles back into whatever article they were split from". In my mind, we are facilitating the growth of articles with a minimum of interwiki-spaghetti-ness, including splitting some very large Wikipedia articles into mobile-friendly shorter articles, not the other way around! Can you please provide a situation where you would want to merge a (hypothetical) "Bonnie" back into a (hypothetical) "Bonnie & Clyde" article? I am still not seeing the benefits of having this feature. I only see reasons against it.
Dec 28 2014
I cannot use the drop-down from my iPad 1, so I disabled this feature. There was no way to update the user preferences to turn it off once I realized I couldn't get to my watchlist either. If the drop-down freezes, there is no workaround, even though I selected the desktop version at the bottom of the page.
Dec 10 2014
I am against implementing this so-called fix, because it would undermine what Wikidata is all about. Wikidata is meant to interlink the same or "highly similar" concepts in different languages. As a language wiki expands, in the normal organic wiki growth model articles get split into more concepts. It is perfectly normal for a short stub on "Bonnie and Clyde" to be lengthened and later even split out into three articles, including separate articles on Bonnie and on Clyde. If in one language there is a book written about Clyde, it is possible that that language only includes Clyde, and so forth. Wikidata should not feel obligated to link all languages that have at least one of these three concepts, but only make direct links based on the items linked to individual articles. This is correct Wikidata behavior. I find it quite interesting, by the way, that this whole conversation seems to be based on examples where the "redirect" case is always notable enough for its own article in any language. It would be more constructive if the complainers here would just go and create those articles instead of trying to create interlanguage redirects. On the subject of usability (which is what we are theoretically trying to improve anyway), only a very tiny minority of mobile users have ever even seen the button at the bottom of the screen to select a different language, so these redirects, if implemented, would never be used anyway.