Deployment of the Maniphest backend to offer at least the information available at http://korma.wmflabs.org/browser/maniphest.html
Description
| Status | Assigned | Task |
|---|---|---|
| Resolved | None | T137997 Visualization/data regressions after moving from korma.wmflabs.org to wikimedia.biterg.io |
| Resolved | Aklapper | T28 Decide on wanted metrics for Maniphest in kibana |
| Resolved | Lcanasdiaz | T138002 Deployment of Maniphest panel |
| Resolved | Lcanasdiaz | T138003 Maniphest support to be added to GrimoireLab |
Event Timeline
This new panel should be published within the next 7 days. I'm asking the devops team for confirmation.
Guys, we are ready to deploy this, but we are running into some issues with the data collection: we cannot download more than 600 tickets. Is there a limit on your side, or do we need a special token? We named our bot "maniphestbot".
@Lcanasdiaz: Any pointers to the code / API call (I assume you use maniphest.info)?
What exactly happens after downloading 600 tasks? Any errors? Timeouts?
@Aklapper we are getting HTTP 503 errors (Service Unavailable).
I've been testing our tool and we get this error from time to time, not consistently after 600 tasks; sometimes it happens earlier and sometimes later. Once I downloaded 1043 tasks with the same process, but most of the time it fails somewhere between 300 and 600.
Are we overloading the server? We are sending at most 3 HTTP requests per second.
@mmodell: Any idea if that's a sane value when it comes to accessing Phab / iridium via the Phab API?
Sorry. I emailed @mmodell a few days ago asking him for input here. Still waiting for a reply... :-/
Thank you @Aklapper! We could deploy it soon (even with just a subset of the data) so you can play with it. It is saving us a lot of time here at Bitergia when filtering tickets, so I hope it will be useful for you too.
@Lcanasdiaz: 3 HTTP requests per second doesn't sound too unreasonable; however, how many tasks are you fetching in a single batch? You should make your code fetch small batches (or even one task at a time) and retry the request when you receive a 503. It shouldn't be too difficult to recover from the errors and continue. With variable load on our database and a strict execution time limit, there will always be an occasional 503 when the service is overloaded.
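For illustration, a minimal sketch of the batch-and-retry approach described above, in Python against Conduit's maniphest.search endpoint (rather than per-task maniphest.info calls). The token placeholder, batch size, and backoff values are assumptions for the example, not the actual collection code:

```python
import time
import requests

PHAB_API = "https://phabricator.wikimedia.org/api/maniphest.search"
API_TOKEN = "api-PLACEHOLDER"  # substitute a real Conduit API token

def fetch_all_tasks(batch_size=50, max_retries=5):
    """Yield tasks in small batches, retrying on HTTP 503 with backoff."""
    after = None  # pagination cursor returned by Conduit
    while True:
        params = {"api.token": API_TOKEN, "limit": batch_size}
        if after is not None:
            params["after"] = after
        for attempt in range(max_retries):
            resp = requests.post(PHAB_API, data=params)
            if resp.status_code == 503:
                # Server overloaded: back off and retry this batch
                # instead of restarting the whole download from scratch.
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            break
        else:
            raise RuntimeError("Giving up after repeated 503 responses")
        result = resp.json()["result"]
        yield from result["data"]
        after = result["cursor"]["after"]
        if after is None:  # no more pages
            return

# Usage: for task in fetch_all_tasks(): process(task)
```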
@Lcanasdiaz: Did Mukunda's comment help?
(Or is there some issue that after a 503 you might need to start from scratch again and hence miss data? If so, more info welcome.)
@Aklapper Data collection is already done; we are polishing the enrichment process (where raw data is converted into enriched data), so the panel will be deployed in the next few days.
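As an aside, a hypothetical sketch of what such an enrichment step might do, assuming raw records shaped like Conduit's maniphest.search results (the output field names are made up for the example, not Bitergia's actual schema):

```python
def enrich(raw_task):
    """Flatten a raw Conduit task record into a document a Kibana
    dashboard could index; keeps only a few illustrative fields."""
    fields = raw_task["fields"]
    return {
        "task_id": raw_task["id"],
        "status": fields["status"]["name"],
        "priority": fields["priority"]["name"],
        "created_at": fields["dateCreated"],    # epoch seconds
        "closed_at": fields.get("dateClosed"),  # None while still open
    }
```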
Note to myself: Once this is deployed, compare to https://rust-analytics.mozilla.community/app/kibana#/dashboard/GitHub-Issues-Timing and https://rust-analytics.mozilla.community/app/kibana#/dashboard/GitHub-Issues (which is GitHub though) and https://eclipse.biterg.io/app/kibana#/dashboard/Issues-Backlog and https://eclipse.biterg.io/app/kibana#/dashboard/Issues
@Aklapper we've updated the dashboard today, finally adding the Maniphest information. You can check it at:
https://wikimedia.biterg.io/app/kibana#/dashboard/Maniphest
https://wikimedia.biterg.io/app/kibana#/dashboard/Maniphest-Backlog
https://wikimedia.biterg.io/app/kibana#/dashboard/Maniphest-Timing
@Lcanasdiaz feel free to close the task
Yay! Thanks and congratulations! Let's close this. :)
I might report some followup bugs (but so far I've only found T161519). :)
I'd have multiple feature requests, depending on the expected purpose of this statistical tool (I think the main purpose should be to compensate for the shortage of reporting / statistics / charts within Phabricator itself, compared to e.g. Bugzilla), but for now I have reported a bug: T161682.
Having basic (and, if needed, specific) statistics about developer activity in the Wikimedia community. :)