Learn more about me at https://en.wikipedia.org/wiki/User:Sadads
- User Since
- May 12 2015, 4:30 PM (140 w, 3 d)
Jun 22 2017
Jun 13 2017
@Doc_James: as it's been explained to me, there is no timestamp or queue of those changes to the "file usage" list -- so it's actually a more complex question of logging those changes, one way or another, in a way that isn't super intensive.
Jun 12 2017
Feb 14 2017
The display of the metadata from Wikisource is kind of ugly in that preview: it's showing both the title and the file name, and some other stuff.
Jan 30 2017
@Nuria the challenge has been that @Legoktm made a good start, but the capacity is not there for sustained, focused work. If you can help connect someone to solving the problem, that would be awesome.
Jan 5 2017
Does this run into some of the same problems as https://meta.wikimedia.org/wiki/Community_Tech/Edit_summary_length_for_non-Latin_languages ?
Dec 22 2016
It might be worth making it a user option for the wikitext editor.
@AlexMonk-WMF for the 2017 wikitext editor. I can think of a number of circumstances where my typing in the wikitext editor has created a bad template, slowing down or interfering with the accuracy of my contributions.
Ahaha, it looks like we need to figure out what the shortcut will be and how it will respond to editors. Thank you for exploring this.
Dec 15 2016
Hi, I saw that this was supported as part of the 2016 Community Wishlist. I wanted to note that there are other applications of data about when images are used and under which revision, including but not limited to: notifying folks of deletion discussions if they have used the media file in their own projects; tracking whether mass uploads have been used by users, where the "uploader" may be an institution or bot operator rather than the creator; etc. I have outlined a somewhat more robust way of tracking that kind of data at T137758, which could then be used to populate these kinds of notifications.
Dec 7 2016
@Cyberpower678 might want to be tagged on this one.
Dec 2 2016
Sep 28 2016
Yeah, I think I can hack around this for the time being: use the pagepile to create a report.
Sep 27 2016
@Ottomata: would https://lists.wikimedia.org/pipermail/wikitech-l/2016-September/086617.html be an appropriate strategy for tracking changes in media files?
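For what it's worth, one way a consumer of the change stream could narrow down to media-file changes is by filtering on namespace. A minimal sketch, assuming events shaped like Wikimedia recentchange events (the field names and the sample event here are assumptions, not anything from the thread):

```python
# Sketch: filter change events down to edits on File pages.
# Assumes events shaped like Wikimedia recentchange events,
# where namespace 6 is the File namespace.

def is_file_change(event):
    """Return True if the event touches a page in the File namespace."""
    return event.get("namespace") == 6 and event.get("type") in ("edit", "new", "log")

sample = {"title": "File:Example.jpg", "namespace": 6, "type": "edit", "wiki": "commonswiki"}
print(is_file_change(sample))  # True for a File-namespace edit
```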
Sep 23 2016
Ah good, let us know if we can help.
Sep 22 2016
Sep 20 2016
Sep 16 2016
Also, another tool that tracks usage of files across wikis: https://tools.wmflabs.org/glamtools/glamorous/ . I think it queries the Commons table that documents reuse directly.
@Nuria This isn't primarily an outreach.wikimedia problem: most of the files used for GLAMorgan end up on other community projects, and the tool seems to hit a ceiling of some sort on the number of requests it runs. I wonder if it has to do with running individual page stats rather than batches.
Sep 14 2016
Depending on how the data is stored, I could also see people looking at a set of articles on a wiki and wanting to see the file-change edits (but that might be a different set of data).
I think the main user story is something like:
*GLAM donates a large number of content items to Commons
*GLAM has a category or pagepile of those files (or another subsection of content) and wants to know "who/when/how" the usage of those files changed
*GLAM plugs the set that needs to be searched into the tool
*GLAM gets a report, so that they can reach out to, reward, or engage this group
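To make the reporting step of that user story concrete, here is a rough sketch assuming the JSON shape returned by the MediaWiki API's `prop=globalusage`; the function name and sample data are mine, not from any existing tool:

```python
# Sketch: turn a prop=globalusage API response into a per-file usage report.
# The response shape assumed here is query.pages.<id>.globalusage, a list
# of {"title", "wiki", "url"} entries for each file.

def usage_report(api_response):
    """Map each file title to the list of (wiki, page) pairs using it."""
    report = {}
    for page in api_response.get("query", {}).get("pages", {}).values():
        uses = [(u["wiki"], u["title"]) for u in page.get("globalusage", [])]
        report[page["title"]] = uses
    return report

sample = {
    "query": {"pages": {"123": {
        "title": "File:Example_painting.jpg",
        "globalusage": [
            {"wiki": "en.wikipedia.org",
             "title": "Example_article",
             "url": "https://en.wikipedia.org/wiki/Example_article"},
        ],
    }}}
}
print(usage_report(sample))
```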
Sep 12 2016
Theoretically, this could be done rather robustly per @Mvolz, once we start sharing structured metadata as described in T68108. @Lydia_Pintscher & the Wikidata team probably have a better sense of whether that is tangible -- but if they think structured data on Commons is likely in the next year or two, I would not rush into building the Dark Archive, because semi-automatic functions are much easier at that point.
@MusikAnimal do you have a sense of whether there is a limit on the API or the computing for this that would make opening up the tool in this way prohibitive?
Sep 9 2016
Good to know, that delays one of my projects then: which is fine, it looked like it might have been too early in next quarter anyway, Alex
Sep 8 2016
@Nuria Cool! Is there a timeline for this: the next couple of sprints?
Sep 7 2016
Thanks for making this a phabricator item! Looking forward to the update on how this works!
Sep 1 2016
Aug 27 2016
+1 to @ManosHacker's strategy on this: the only real use cases are in bulleted or numbered lists.
Jul 26 2016
Finally! Yay! Super excited!
Thanks @Nuria, that's good to know: community programs and events could really use the data coming off these wikis.
Jul 13 2016
Jul 11 2016
@kaldari & @Niharika Currently the OCR gadget depends on external tools built on Tesseract OCR (see the list of supported languages at https://github.com/tesseract-ocr/langdata), which are used when HOCR text is not available in the object itself (PDFs and/or DjVus will carry HOCR text with them if available). See https://wikisource.org/wiki/MediaWiki:OCR.js .
What are the languages that are currently unsupported by the OCR on Wikisource?
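One way to start answering that would be to check wiki language codes against Tesseract's traineddata codes. A sketch with a deliberately tiny, illustrative mapping -- the real langdata list linked above is much longer, and the mapping here is my assumption, not the gadget's actual logic:

```python
# Sketch: check whether a wiki language code has a Tesseract traineddata
# package. The mapping below is a tiny illustrative subset, not the full
# langdata list from https://github.com/tesseract-ocr/langdata.

TESSERACT_CODES = {
    "en": "eng",
    "fr": "fra",
    "de": "deu",
    "ta": "tam",  # Tamil
    "bn": "ben",  # Bengali
}

def ocr_supported(wiki_lang):
    """Return the Tesseract code for a wiki language, or None if unsupported."""
    return TESSERACT_CODES.get(wiki_lang)

print(ocr_supported("ta"))   # tam
print(ocr_supported("mai"))  # None -> a candidate for the "unsupported" list
```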
Jul 8 2016
Jul 7 2016
As a followup to the comment on T132088#2360795 -- @Catrope we talked to @Quiddity and @jmatazzoni about creating a notification to help encourage folks to get Wikipedia Library access. Do you have a sense of when this could get done?
Jul 6 2016
Sounds great! That's a bit hacky, but it would make a huge difference for those of us working in metaspaces.
Jul 5 2016
@Multichill yes, theoretically any map, but really the stuff that we have a whole lot of data for (paintings, for instance, or cultural heritage). The idea would be to find some way to get a map or graph created by a volunteer, of educational value, out onto other websites, ideally with the added value of Wikidata queries or snapshots from Wikidata queries.
Mostly solved with Massviews; a better conversation/bug is at https://phabricator.wikimedia.org/T135437
Sounds good! Thanks so much!
Jul 1 2016
Thanks @Danny_B, I moved that recommendation to https://www.mediawiki.org/wiki/Phabricator/Creating_and_renaming_projects so that newbies can do it easily.
Jun 30 2016
Jun 20 2016
Jun 17 2016
Jun 14 2016
Might be dependent, in part, on T60698
Jun 13 2016
Jun 10 2016
@Niharika and @Tshrinivasan I will be at the hackathon, so I would love to talk to you at Wikimania about the code and the problem being solved by this bug -- I want to make sure the conversations we start help you in this work.
Jun 8 2016
@Tshrinivasan I missed the change; sorry for not following up sooner. I am looking into it: we (WMF) may actually be able to negotiate something with Google -- I think they offer access to it as part of their cloud computing services: https://cloud.google.com/vision/
Jun 7 2016
@Catrope that would be awesome if you can give it a try. We are hoping to start work with Collaboration sometime early next quarter, and we want to make sure we understand what we are asking them to build notifications for.
May 19 2016
Thanks for advocating for us Danny! The new graphs are awesome :)
I would think enabling it across the board by default is going to be important, both for using it as a tool for tracking and, as the data becomes available, for feeding it into other tools.
@Magnus any thoughts on this?
May 13 2016
@MusikAnimal I have been getting "langview" errors for Pagepile 3053 .
@Nuria do we have a sense of when this will happen? There are a fair number of dependencies on having this data available.
May 12 2016
@Tshrinivasan out of curiosity, would access directly to the Google OCR tools through Google be better than the current architecture, which requires routing content through Google Drive? And, are the Indic languages the only ones that would benefit from access to the OCR suite?
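For reference, going to Vision directly would be a single JSON POST rather than a round-trip through Drive. A minimal sketch of the request body, where the image bytes and the language hint are placeholders and the actual authentication and HTTP call are omitted -- check Google's current docs before relying on this:

```python
import base64
import json

# Sketch: build the JSON body for a Google Cloud Vision images:annotate
# request with TEXT_DETECTION. The image bytes and language hint are
# placeholders; authentication and the HTTP POST are omitted.

def vision_ocr_request(image_bytes, lang_hint):
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "TEXT_DETECTION"}],
            "imageContext": {"languageHints": [lang_hint]},
        }]
    }

body = vision_ocr_request(b"fake-image-bytes", "ta")
print(json.dumps(body)[:80])
```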
May 10 2016
In the short term, we have been working with the developer of the Google Spreadsheets Wikipedia tool, which is namespace-agnostic, so it's super easy to grab non-article namespaces -- not quite a pagepile, but fairly dynamic as a short-term fix (we have a demo at https://docs.google.com/spreadsheets/d/1hUbMHmjoewO36kkE_LlTsj2JQL9018vEHTeAP7sR5ik/edit with Wikipedia Library project pages).
@Legoktm and @kaldari do we have a sense of why the data isn't logging yet? @Milimetric tried to check the log for the event and wasn't able to find the collection. Perhaps we are looking in the wrong location? The typo "ExtenalLinksChange" still carried through; it ought to be "ExternalLinksChange".
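If it helps with the debugging, a quick way to spot a misspelled schema in raw event log lines is a fuzzy match against the intended name. A sketch, assuming a log format of one JSON object per line with a "schema" field (the format, function names, and sample lines are my assumptions):

```python
import difflib
import json
from collections import Counter

# Sketch: count events per schema and flag names that look like near-miss
# spellings of the schema we expect. Assumes one JSON object per log line
# with a "schema" field.

EXPECTED = "ExternalLinksChange"

def schema_counts(log_lines):
    return Counter(json.loads(line)["schema"] for line in log_lines)

def near_misses(counts):
    """Schemas that are close to, but not exactly, the expected name."""
    return [s for s in difflib.get_close_matches(EXPECTED, counts, cutoff=0.8)
            if s != EXPECTED]

lines = ['{"schema": "ExtenalLinksChange"}', '{"schema": "Edit"}']
print(near_misses(schema_counts(lines)))  # the misspelled schema shows up here
```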
@Samwalton9 might be interested as well.
May 9 2016
@Neil_P._Quinn_WMF , @Halfak and @Capt_Swing sorry for not following up sooner: we just need to know how many people got past the criteria of 500 edits and 6 months of activity in the last 3-4 months, so that we know what the trend is. Thanks @egalvezwmf for getting this started.