User Details
- User Since
- Jul 28 2015, 6:16 PM (227 w, 3 d)
- Availability
- Available
- LDAP User
- Unknown
- MediaWiki User
- Jgbarah
Sep 3 2017
Any news on when we will see https://wikimedia.biterg.io/app/kibana#/dashboard/C_Gerrit_Demo deployed 'by default'?
And, way more important (as this blocks T167085): is it possible to update the data? Currently the latest "new authors" entry is from July 13th, and we need recent names...
Nov 24 2016
I guess we'd better close this task. Data retrieval for MediaWiki is now done with a Perceval backend, which probably gets all the needed information.
Feb 29 2016
Hi. Sorry for the delay in answering.
Dec 20 2015
We would like to follow your plans to migrate to Differential as closely as possible. During the last monthly meeting, @Aklapper mentioned (if I'm not wrong) that this was very unlikely to happen during the next three or four months. Given this schedule, it is likely that, from the upstream point of view, we will support this with the new "version" of our tools, which will start to be published during the next few weeks.
Nov 11 2015
I finally ranked my entry. Sorry for the delay.
Nov 8 2015
Just a minor note. From the point of view of the historical information, what is needed is that "closing transaction" (in Bugzilla, a change of state to closed) with the date of the closing action. That allows for calculating the backlog at any snapshot time in the past, for example.
Nov 6 2015
Finally I didn't have good candidates for the other project, so I resigned and applied to be a mentor for Wikimedia. I'm just waiting for approval.
Oct 31 2015
It seems I cannot apply to be a mentor for Wikimedia, because I already applied as a mentor for Xen (I'm participating in their program as well). Does anybody know to whom this can be reported?
Oct 15 2015
Yes, if I remove print(pageid + " " + namespace + " " + title + " was already in the db") and replace it with print(pageid + " was already in the db"), I don't get this problem. The problem is with the encoding of the title, I suppose.
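For illustration, a minimal Python sketch (the title value is made up) of the likely failure mode: a non-ASCII title cannot be encoded by an ASCII-only output stream, which matches the symptom that dropping the title from the print makes the error disappear. Encoding explicitly to UTF-8 sidesteps the stream's default encoding:

```python
title = "Überseite"   # hypothetical non-ASCII page title
pageid = "12345"
message = pageid + " " + title + " was already in the db"

# Encoding to ASCII fails, matching the symptom described above:
try:
    message.encode("ascii")
    failed = False
except UnicodeEncodeError:
    failed = True
print(failed)  # → True

# Encoding explicitly to UTF-8 works regardless of the stream's default:
data = message.encode("utf-8")
print(data.decode("utf-8") == message)  # → True
```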
BTW, to ease the process of reviewing the code, maybe you could fork the MediaWikiAnalysis GitHub repository, point me to your fork, and I'll just clone it. That way I can follow your changes more easily.
In fact, after a while, I get an error with your code:
I've run your code, and I don't see that unicode error you see:
Oct 14 2015
I guess we've worked on that in the past. It seems so, and I think it is already fixed. Checking the code in GrimoireLib:
Oct 6 2015
Proposal for GSoC 2015, by Sarvesh Gupta, moved to this comment to make the description clearer, while preserving the interesting analysis and planning, which maybe somebody can use as inspiration.
Oct 2 2015
This one needs some knowledge of Python, and will require you to learn about SQLAlchemy. Fortunately, there is plenty of documentation about SQLAlchemy, and the task needs little analysis: it does the same as the tool does now, but using SQLAlchemy.
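As a rough sketch of what such a rewrite could look like, here is a minimal SQLAlchemy Core example (the `pages` table and its columns are hypothetical illustrations, not the tool's actual schema, and an in-memory SQLite database stands in for the real one):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine

# Hypothetical table mirroring the kind of data MediaWikiAnalysis stores.
metadata = MetaData()
pages = Table(
    "pages", metadata,
    Column("pageid", Integer, primary_key=True),
    Column("namespace", Integer),
    Column("title", String(255)),
)

engine = create_engine("sqlite:///:memory:")
metadata.create_all(engine)

with engine.connect() as conn:
    conn.execute(pages.insert(), [
        {"pageid": 1, "namespace": 0, "title": "Main Page"},
        {"pageid": 2, "namespace": 1, "title": "Talk:Main Page"},
    ])
    # SQLAlchemy builds the SQL; no hand-written query strings needed.
    rows = conn.execute(pages.select().where(pages.c.namespace == 0)).fetchall()
    print(rows)
```

The point of the exercise is that the queries the tool currently writes by hand become expressions like `pages.select().where(...)`, which SQLAlchemy translates to SQL for whatever database backend is configured.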
Sep 1 2015
From the list in Group Non-Interactive Users, I tagged gerritpatchuploader@gmail.com, yuvipanda+suchabot@gmail.com, and wikidata-services@wikimedia.de as bots. I couldn't find jdlrobson+frankie@gmail.com in our database. Maybe it was never active?
Added a Contributors | Bots subsection to the Community Metrics wiki with this info.
The list of bots is currently maintained in the Sorting Hat (identities) database, in the "profiles" table. If the "is_bot" field is 1, the identity is considered a bot; otherwise, that field is 0.
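For illustration only, a minimal sketch of that lookup using SQLite from the Python standard library (the real Sorting Hat database is MySQL, and the columns other than `is_bot` are assumptions for this example):

```python
import sqlite3

# Illustrative stand-in for the Sorting Hat "profiles" table; only the
# "is_bot" field is documented above, the other columns are assumed.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE profiles (uuid TEXT PRIMARY KEY, name TEXT, is_bot INTEGER)"
)
conn.executemany(
    "INSERT INTO profiles VALUES (?, ?, ?)",
    [("u1", "gerritpatchuploader@gmail.com", 1),
     ("u2", "Some Human", 0)],
)

# Identities with is_bot = 1 are considered bots; everything else is 0.
bots = [row[1] for row in
        conn.execute("SELECT uuid, name FROM profiles WHERE is_bot = 1")]
print(bots)  # → ['gerritpatchuploader@gmail.com']
```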
Aug 28 2015
I see three issues here:
Aug 27 2015
We've removed a bunch of organizations (see below), but I'm going to explain the situation, just to make sure we're on the same page.
After looking at the available information, the current list of identities labeled as "bot" is:
Jul 31 2015
Dani is on vacation these days, and he is the most knowledgeable about those KPIs. We are already working on this anyway. In particular, we're trying to see what's happening with T103292. We'll keep you informed.
Jul 30 2015
In fact, the main reason would be to simplify (and make more accurate) the retrieval of historical stats of almost any kind. For example, if at some point you're interested in studying the historical evolution of time-to-close, those closing records will be needed.
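A minimal sketch (with made-up dates) of why the closing records matter: time-to-close is just the difference between the two recorded dates, and it cannot be recovered for any issue whose closing transaction is missing:

```python
from datetime import date

# Hypothetical closing records: (opened, closed). With both dates stored,
# time-to-close is a simple subtraction per issue.
records = [
    (date(2015, 1, 10), date(2015, 3, 1)),
    (date(2015, 2, 20), date(2015, 2, 25)),
]

ttc = [(closed - opened).days for opened, closed in records]
print(ttc)  # → [50, 5]
```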
Jul 29 2015
OK, waiting for @chasemp's comments, if any.