Context
Tech Community Building Key Result
Users can assess the reliability of tools for adoption, contribution, or research based on a system of quality signals (co-maintainers, docs, recent editing usage, published source code, endorsements, etc.) within Toolhub.
The outcome of this spike is to identify which metrics would be feasible to expose on a per-tool basis. The acceptance criteria define which metrics we want to investigate, not necessarily implement.
Next to each AC, state "Yes" or "No" to indicate whether the metric is available for us to expose. If "Yes", show an example of the data or give a short explanation of how to provide it.
Acceptance Criteria
On-wiki
- User Scripts
- Gadgets
- Lua Modules
- Templates
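For the on-wiki tool types above, one usage signal that is already queryable is the transclusion count of a template or Lua module, via the MediaWiki Action API's `list=embeddedin` module. A minimal sketch of how that data could be gathered (the sample response below is fabricated for illustration, and a full count would need to follow `continue` pagination across batches):

```python
# Sketch: count pages that transclude a template via the MediaWiki Action API.
# Endpoint per wiki: https://<wiki>/w/api.php
# Results arrive in batches; the total is accumulated across `continue` pages.

def embeddedin_params(title: str, limit: int = 500) -> dict:
    """Query parameters for list=embeddedin (pages transcluding `title`)."""
    return {
        "action": "query",
        "list": "embeddedin",
        "eititle": title,
        "eilimit": limit,
        "format": "json",
    }

def count_batch(response: dict) -> int:
    """Number of transcluding pages in one API response batch."""
    return len(response.get("query", {}).get("embeddedin", []))

# Illustrative (fabricated) shape of one response batch:
sample = {
    "query": {
        "embeddedin": [
            {"pageid": 1, "ns": 0, "title": "Example A"},
            {"pageid": 2, "ns": 0, "title": "Example B"},
        ]
    }
}
print(count_batch(sample))  # → 2
```

The same query pattern works for user scripts and gadgets only indirectly (they are loaded, not transcluded), so those types would need a different signal, such as counting `importScript` references or gadget enrollment numbers.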
Off-wiki
- Web Services
- “Bots” (software that makes automated edits)
  - Usage
    - Supported Wikis
    - Number of edits applied
- Usage
  - Desktop Apps
  - Native Mobile Apps
    - Number of unique users per tool
  - CLI
    - Number of unique users per tool
  - Coding Framework
    - Number of unique users per tool
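Of the off-wiki metrics, the bot "number of edits applied" looks the most directly feasible: assuming a bot runs under a dedicated registered account, each wiki's Action API exposes that account's total edit count via `list=users` with `usprop=editcount`. A minimal sketch (account name and numbers below are illustrative, not real data; per-wiki totals would repeat the query against each supported wiki's `api.php`):

```python
# Sketch: read a bot account's total edit count via the MediaWiki Action API
# (action=query&list=users&usprop=editcount). Account name is a placeholder.

def editcount_params(username: str) -> dict:
    """Query parameters for list=users with the editcount property."""
    return {
        "action": "query",
        "list": "users",
        "ususers": username,
        "usprop": "editcount",
        "format": "json",
    }

def extract_editcount(response: dict) -> int:
    """Pull the edit count out of the API response for a single user."""
    users = response.get("query", {}).get("users", [])
    return users[0].get("editcount", 0) if users else 0

# Illustrative (fabricated) response for one bot account:
sample = {"query": {"users": [{"userid": 1, "name": "ExampleBot", "editcount": 123456}]}}
print(extract_editcount(sample))  # → 123456
```

The unique-user counts for desktop apps, CLIs, and frameworks have no comparable API and would likely depend on what telemetry, download statistics, or user-agent analysis each tool's maintainers can provide, so those ACs may well come back "No".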