
Review KPIs, workflows, and scorecards of the Technical Collaboration team
Closed, Resolved, Public

Description

The WMF quarterly review materials focus on team goals, but also include KPIs, workflows, and scorecards. For T119387: Community Liaison and Developer Relation quarterly goals for January - March 2016 we basically combined the data points that Community Liaisons and Developer Relations had in their previous quarterly reviews, but we should review everything and decide what makes sense to keep and what is missing.

We should also have on-wiki versions of this data. Or, to be more precise, we should keep track of this data on-wiki on a regular basis and then paste the information into our quarterly review slides.

Proposal

Community Liaisons

  • Percentage of Product and Technology goals requiring CL support that are resourced by the next quarterly check-in.
  • Beyond team goals, percentage of support requests submitted to CL by Product and Technology that receive a response within a week.
  • Beyond team goals, percentage of support requests accepted that are resourced within three months.
  • Number of unanticipated clashes with the communities related to goals or tasks where CL support has been committed, or where Technical Collaboration Guidance best practices have been followed by the Foundation teams (CLs must ensure that development teams are aware of potential and emerging points of conflict).
  • Number of wiki projects requesting beta features and early deployments (where applicable).
  • Percentage of the top 25 wiki projects in terms of active contributors that have at least one active tech ambassador and one translator identified (could be the same person).
  • Degree of satisfaction of teams receiving CL support or using TCG documentation, recorded via surveys and/or interviews.
  • Ratings of major community collaborations by volunteers, gathered through surveys, to verify whether TCG expectations on best practices were met.

Developer Relations

  • Number of volunteers contributing code to Wikimedia repositories, total & recommended projects. (*)
  • Number of new volunteers contributing code to Wikimedia repositories, total & recommended projects. (*)
  • Percentage of retention of new developers 12 months after their first contribution, total & recommended projects. (*)
  • Number of software projects recommended for new developers, based on their ability to provide mentors, good entry-level documentation, first tasks, and a roadmap. (*)
  • Number of Wikimedia affiliates and partner organizations (through Wikipedia Education Program, GLAM...) involved in developer outreach activities.

(*) Recommended projects may include not only those going through Gerrit/Differential, but also Labs tools, bots, gadgets, Lua modules, and GitHub-hosted projects.

(old) Candidates proposed

Quoting the Technical Collaboration team strategy (which might need review, see T131689).

  • Number of volunteers contributing code to Wikimedia projects / Community Wishlist projects.
  • Number of Community Wishlist projects completed by volunteers.
  • Number of major Product discussions that the team supports.
  • Number of major Product discussions that follow the Technical Collaboration Guidance vs. those in conflict with it.

Other ideas to be discussed:

Event Timeline

Qgil raised the priority of this task from Low to Medium. Apr 20 2016, 7:23 AM
Qgil moved this task from Ready to Go to April on the Developer-Advocacy (Apr-Jun-2016) board.

A lot of my "management" time during this quarter has gone into the annual plan, end-of-fiscal-year work, and hiring a developer advocate. Our next quarterly review in a couple of weeks will follow the same structure as the previous one. I'm pushing this task to the next quarter.

Qgil raised the priority of this task from Medium to High. Sep 2 2016, 10:16 AM
Qgil lowered the priority of this task from High to Medium. Oct 5 2016, 9:19 AM
Qgil moved this task from October to November on the Community-Relations-Support (Oct-Dec-2016) board.

I missed another quarterly review round. There is no point in discussing KPIs etc. now in a rush.

Anyway, there are some efforts to improve quarterly reviews WMF-wide. Maybe jumping on that bandwagon is a better idea.

Note by @Elitre: if someone had participated in a conference and given a session, in which workflow should that be mentioned?

(I don't know where it would belong yet, but it sounds like an interesting metric for the entire dept/org?)

"Number of participation to community events" would be a good KPI, but is attending to such events in our scope? (To me it is a yes.)

Mere participation, maybe not so much, although it may be yet another thing to track somewhere.

Qgil raised the priority of this task from Medium to High. Nov 11 2016, 2:06 PM

Ok, I'll try not to miss this train.

I had this fantasy about storing tabular data in Commons and then presenting it with graphs on our wiki pages. Technically doable. A bit of work the first time, but then adding data once per quarter should be quite trivial.
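This maps onto the tabular-data format Commons supports in its Data namespace (pages named `Data:….tab` containing JSON, released under CC0). A minimal sketch of what such a page body could look like; the page layout, field names, and figures are placeholders, not actual team data:

```python
import json

# Hypothetical KPI table in the Commons tabular-data (.tab) page format:
# a JSON object with a CC0 license, a localized description, a schema
# describing the columns, and the data as an array of rows.
kpi_table = {
    "license": "CC0-1.0",
    "description": {"en": "Quarterly KPIs of the Technical Collaboration team"},
    "schema": {
        "fields": [
            {"name": "quarter", "type": "string", "title": {"en": "Quarter"}},
            {"name": "new_developers", "type": "number",
             "title": {"en": "New volunteer developers"}},
        ]
    },
    "data": [
        ["2016-Q4", 42],  # placeholder values
        ["2017-Q1", 47],
    ],
}

# The JSON body that would be saved as the Data:….tab page.
print(json.dumps(kpi_table, indent=2))
```

A graph on a wiki page can then reference the data page, so the quarterly update reduces to appending one row.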

Thanks to @Elitre, I am learning how the Team-Practices group uses light engagement surveys with the Foundation teams they work with.

This task is progressing indirectly thanks to the initial discussions about the Wikimedia Foundation Annual Plan program that will organize the support of Community Liaisons to Product. We will share this proposal publicly as part of the WMF annual plan process.

We *might* discuss and decide on these KPIs etc before. We'll see.

I have captured in the description measurements that are being discussed as part of our annual plan drafts.

I believe we have enough ingredients in the Community Liaisons side.

@Aklapper @srishakatux @Rfarrand, we need to define the exact metrics that we are going to use to measure our efforts onboarding new developers.

Code review is a must (and Andre should propose what exactly we are measuring there), but it cannot be the only metric. We have other entry points that don't go through code review: Labs, bots, gadgets, Lua modules, whatever happens on GitHub...

The good news is that the scope to measure first is the recommended projects. From the description:

Number of software projects recommended for new developers, based on their ability to provide mentors, good entry-level documentation, first tasks, and a roadmap.

If we nail down the metrics within this very controlled scope then we will have a very good starting point.

Quim, have the KPIs for CLs changed? I see a long list of things to consider, but no comments that look like a final decision.

I think those are the ones suggested for our future annual program; they're not "frozen", but they make sense with the kind of work we said we'd do.

Yes, the batch I added comes from our annual plan proposal, where targets and milestones are required. As Erica says, those are currently proposed and open to discussion.

Something I like about this situation is that these KPIs will become official on July 1 (together with the rest of the WMF Annual Plan) unless someone challenges them before. :)

Code review is a must (and Andre should propose what exactly are we measuring there)

I've created T160430 for that (which is a stub)...

If we nail down the metrics within this very controlled scope then we will have a very good starting point.

Generally speaking, https://wikimedia.biterg.io allows applying a filtered view of activity in certain code repositories (and sharing a static URL for that view).

[slightly offtopic]

We have other entry points that don't go through code review: Labs, Bots, Gadgets, Lua Modules, whatever happens in GitHub...

Regarding Lua modules, I asked anomie in 11/2016 "if you know of any way to get a list of the most active Lua template editors across WMF sites?"
His answer was "Not really. The best you might do would be to look at the top editors in the Module namespace on different wikis, although that might as well catch people who make lots of edits to some test module [...] or (on smaller wikis) people who copied/pasted modules from other wikis."
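As a rough sketch of the workaround anomie describes (look at the top editors in the Module namespace, with all the stated caveats about test modules and copied/pasted modules), the MediaWiki action API can list Module-namespace (828) revisions one wiki at a time. The wiki and limit below are illustrative:

```python
from collections import Counter
import json
import urllib.request

# One wiki at a time; en.wikipedia.org is just an example.
API = "https://en.wikipedia.org/w/api.php"

def fetch_module_revisions(limit=500):
    """Fetch one batch of Module-namespace (ns 828) revisions,
    keeping only the user name of each revision."""
    params = (
        "action=query&list=allrevisions&arvnamespace=828"
        f"&arvprop=user&arvlimit={limit}&format=json"
    )
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        return json.load(resp)["query"]["allrevisions"]

def top_editors(pages, n=10):
    """Aggregate per-user edit counts from the API response shape:
    a list of pages, each with a 'revisions' list of {'user': ...}."""
    counts = Counter(
        rev["user"]
        for page in pages
        for rev in page.get("revisions", [])
        if "user" in rev  # skip suppressed user names
    )
    return counts.most_common(n)

# Example (network call): print(top_editors(fetch_module_revisions()))
```

This still over-counts exactly the cases anomie warns about, so the numbers would need manual review before being used as a metric.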

Qgil moved this task from May to June on the Community-Relations-Support (Apr-Jun 2017) board.

I think https://meta.wikimedia.org/wiki/Technical_Collaboration/Metrics is a good result of the review this task was intended for. Starting in July, we will use the new metrics. Quarterly check-ins provide an unavoidable reminder to update them.

Fine-tuning and improvements are still expected based on their regular use. Thank you to all contributors to this task, especially @Elitre and @Aklapper for their endless patience and attention to detail.