User Story
As a Data Engineer, I want to document how web requests are logged, monitored, and checked for data quality, so that we can be transparent about what we have now and understand what we want to have.
Questions to answer (please add more as needed):
- What are the processing steps involved?
- For each processing step:
  - What unit tests are in place?
  - What is logged during the step?
  - Where are the logs stored?
  - Are the logs visualized? If so, where?
- At the dataset level:
  - What other data quality checks do we have?
  - What alerts do we send? Where do they go? Which system sends the alert?
  - What are the upstream dependencies?
  - How do we determine the completeness of the dataset?
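As a starting point for the completeness question, one common approach is to compare the partitions actually received against the partitions expected for a time window. The sketch below is illustrative only, assuming hourly partitions; the function names and partition scheme are assumptions, not a description of our current pipeline.

```python
from datetime import datetime, timedelta

def expected_hours(start: datetime, end: datetime) -> set:
    """All hourly partitions expected in [start, end)."""
    hours = set()
    cur = start.replace(minute=0, second=0, microsecond=0)
    while cur < end:
        hours.add(cur)
        cur += timedelta(hours=1)
    return hours

def missing_partitions(received: set, start: datetime, end: datetime) -> set:
    """Partitions expected but not received; non-empty means the window is incomplete."""
    return expected_hours(start, end) - received

# Hypothetical example: one hour of request logs missing from a 3-hour window.
received = {datetime(2023, 1, 1, 0), datetime(2023, 1, 1, 2)}
gaps = missing_partitions(received, datetime(2023, 1, 1, 0), datetime(2023, 1, 1, 3))
print(sorted(gaps))  # the 01:00 partition is missing
```

A check like this could feed the alerting question above: alert when `missing_partitions` is non-empty past some grace period.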
Deliverable
Document with all the above information:
https://docs.google.com/document/d/1clSe6bnIxJUdd2LaFtQ-_MXGIRFBOzKNVZ2130qfGqI/edit