
[SPIKE] Identify potential metrics which could be computed for tools
Open, Needs Triage · Public · Spike

Description

Context

Tech Community Building Key Result
Users can assess the reliability of tools for adoption, contribution, or research based on a system of quality signals (co-maintainers, docs, recent editing usage, published source code, endorsements, etc.) within Toolhub.

The outcome of this spike is to identify which metrics would be feasible to expose on a per-tool basis. The acceptance criteria define which metrics we want to investigate, not necessarily implement.

For each AC, please state "Yes" or "No" to indicate whether the metric is available for us to expose. If "Yes", show an example of the data or give a short explanation of how to provide it.

Acceptance Criteria

On-wiki

  • User Scripts
  • Gadgets
  • Lua Modules
  • Templates (see the transclusion-count sketch after this list)
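
For templates and Lua modules, one usage signal that is already queryable is the number of pages transcluding them. Below is a minimal sketch, assuming the standard MediaWiki Action API with list=embeddedin and Python requests; the module name and the hard cap on pagination are only illustrative, and very heavily used pages would need a cheaper data source than full pagination.

```python
import requests

def transclusion_count(api_url, title, cap=5000):
    """Count pages that transclude a given template or Lua module.

    Paginates through list=embeddedin; `cap` bounds the work for very
    heavily used pages, where an exact count this way may be impractical.
    """
    session = requests.Session()
    params = {
        "action": "query",
        "list": "embeddedin",
        "eititle": title,
        "eilimit": "500",
        "format": "json",
        "formatversion": 2,
    }
    count = 0
    while True:
        data = session.get(api_url, params=params, timeout=30).json()
        count += len(data["query"]["embeddedin"])
        if "continue" not in data or count >= cap:
            break
        params.update(data["continue"])
    return count

# Illustrative example: usage of a citation module on English Wikipedia.
api = "https://en.wikipedia.org/w/api.php"
print(transclusion_count(api, "Module:Citation/CS1"))
```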

Off-wiki

  • Web Services
  • “Bots” (automated software that makes edits)
    • Usage
      • Supported Wikis
      • Number of edits applied (see the sketch after the Off-wiki list)
  • Desktop Apps
  • Native Mobile Apps
    • Number of unique users per tool
  • CLI
    • Number of unique users per tool
  • Coding Framework
    • Number of unique users per tool
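
For bots that use a unified (CentralAuth) account, both "supported wikis" and "number of edits applied" could be approximated from meta=globaluserinfo, which returns per-wiki edit counts for every attached wiki. A rough Python sketch follows; "ExampleBot" is a placeholder account name, and "supported wikis" is approximated here as wikis where the account has at least one edit.

```python
import requests

API = "https://meta.wikimedia.org/w/api.php"

def bot_edit_summary(username):
    """Fetch per-wiki edit counts for a CentralAuth bot account.

    Uses meta=globaluserinfo with guiprop=merged, which lists every wiki
    the account is attached to along with its local edit count.
    """
    params = {
        "action": "query",
        "meta": "globaluserinfo",
        "guiuser": username,
        "guiprop": "merged",
        "format": "json",
        "formatversion": 2,
    }
    data = requests.get(API, params=params, timeout=30).json()
    merged = data["query"]["globaluserinfo"].get("merged", [])
    per_wiki = {acct["wiki"]: acct.get("editcount", 0) for acct in merged}
    supported = sorted(w for w, edits in per_wiki.items() if edits > 0)
    return {"supported_wikis": supported, "total_edits": sum(per_wiki.values())}

# "ExampleBot" is a placeholder; substitute a real bot account name.
print(bot_edit_summary("ExampleBot"))
```

Per-tool unique-user counts for desktop apps, CLIs, and coding frameworks are not covered by this sketch; they would depend on whatever telemetry or API request attribution those tools already emit.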

Event Timeline

Restricted Application changed the subtype of this task from "Task" to "Spike". · Wed, Nov 24, 8:35 PM
Restricted Application added a subscriber: Aklapper.

Notes from our conversation in Toolhub Sync:
All on-wiki tool types exist within a MediaWiki deployment.

Attributes

  • Who (users, types of user roles, etc.)
  • What (Category, purpose)
  • Where (projects, geography)
  • How

Scope

  • Programmatically available to aggregate
bd808 renamed this task from "[SPIKE] Identify & Expose Tool Metrics" to "[SPIKE] Identify potential metrics which could be computed for tools". · Wed, Dec 1, 11:05 PM