As we develop the feature, we must understand its constituent parts and iterate on them individually.
The BreakingNews signal has an important dependency on templates. How and when editors apply templates to new articles affects the latency and accuracy of the results our signal will produce. To better understand how changes in template usage can affect results, we must analyze (and potentially create) data on their usage.
This ticket may demand collaboration with product analytics. (Francisco to talk with Prod Analytics to see what is needed)
**To do**
- How many articles per day receive one or more of the templates we use? (The list of templates and categories is part of the POC)
- How many **//new//** articles per day receive one or more of the templates we use?
- On average, how much time (//in minutes//) elapses between the creation of an article and the addition of the template?
- On average, how much time //(in minutes)// elapses between the template being added and being removed?
- Which new articles receive which templates?
- Which new articles receive which templates that are later removed?
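The two time-delta metrics above could be computed along these lines. This is a rough sketch: the event fields and sample values are hypothetical, not an existing schema or log format.

```python
from datetime import datetime

# Hypothetical template-event records; field names and values are
# illustrative assumptions, not an existing schema.
events = [
    # (article, template, article_created, template_added, template_removed)
    ("Article A", "Current event", datetime(2023, 5, 1, 10, 0),
     datetime(2023, 5, 1, 10, 30), datetime(2023, 5, 2, 10, 30)),
    ("Article B", "Current event", datetime(2023, 5, 1, 12, 0),
     datetime(2023, 5, 1, 12, 10), None),  # template never removed
]

def avg_minutes(deltas):
    """Average a list of timedeltas, expressed in minutes."""
    if not deltas:
        return None
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 60

# Average minutes from article creation to template addition.
creation_to_add = avg_minutes(
    [added - created for _, _, created, added, _ in events])

# Average minutes from template addition to removal,
# counting only templates that were actually removed.
add_to_removal = avg_minutes(
    [removed - added for _, _, _, added, removed in events
     if removed is not None])

print(creation_to_add)  # 20.0
print(add_to_removal)   # 1440.0
```

The same aggregations would presumably run as queries against whatever event store we end up with; the sketch just pins down the definitions of the two averages.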
**Open Questions**
- Do we already log this data?
- Where is it?
- Can we create what does not exist?
**Acceptance Criteria**
What data needs to be collected to answer the questions above?
How are we going to store that data and make it available to BI tools?
- [TO BE EXPLORED WITH TEAM -- dashboard/spreadsheet/database]
Francisco to add a sample spreadsheet