This task is to bootstrap a minimally working API, focused only on the build phases for now.
Things to keep in mind:
- Create skeleton repository (copy one of the ones we have already)
- Using FastAPI for quick POC (https://fastapi.tiangolo.com)
- Using standard toolforge component CI and helm chart
- Add it to the api-gateway, both in the nginx config and in values.yaml (python app config)
Data structures
Responses
Every response should be wrapped in a messages envelope; see T356974: [builds-api,jobs-api,envvars-api,api-gateway] Figure out and document how to do non-backwards compatible changes.
Config
For the config data structure see T362070: [components-api] Get a minimal version of the config with build-only data. It should be stored in the tool namespace as read-only.
Only one per namespace.
Deployment
Example potential deployment data structure:

deploy-id: "mytool-deploy-datestamp-randint"
creation-time: "2024-04-04 12:00:00"
builds:
  component1:
    build-id: build-id-1234
  component2:
    build-id: build-id-1235

It should be stored in the tool namespace as read-only.
Keep the latest 10.
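A sketch of the structure above in code, including the "keep the latest 10" pruning; the field names mirror the YAML example, while the ID format and the pruning helper are assumptions (the real object will live as a read-only k8s resource):

```python
import random
from dataclasses import dataclass, field
from datetime import datetime, timezone

MAX_DEPLOYMENTS = 10  # "Keep the latest 10"


@dataclass
class Deployment:
    tool: str
    builds: dict[str, dict[str, str]]  # component -> {"build-id": ...}
    deploy_id: str = ""
    creation_time: str = field(
        default_factory=lambda: datetime.now(timezone.utc).strftime(
            "%Y-%m-%d %H:%M:%S"
        )
    )

    def __post_init__(self) -> None:
        if not self.deploy_id:
            # "<tool>-deploy-<datestamp>-<randint>" as in the example above
            datestamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
            self.deploy_id = (
                f"{self.tool}-deploy-{datestamp}-{random.randint(0, 9999)}"
            )


def prune_oldest(deployments: list["Deployment"]) -> list["Deployment"]:
    """Drop the oldest deployments so only the latest 10 are kept."""
    ordered = sorted(deployments, key=lambda d: d.creation_time)
    return ordered[-MAX_DEPLOYMENTS:]
```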
Endpoints
| GET /api/v1/tool/<toolname>/config | Returns the current configuration if any or 404 if none found (returning it as-is, without filling up the defaults) |
| POST /api/v1/tool/<toolname>/config | Updates the configuration with the given one (storing it as-is, without filling up the defaults) - using the Config datastructure |
| POST /api/v1/tool/<toolname>/deploy | Creates a new deployment (Deployment datastructure) and starts all the builds from the stored configuration |
| GET /api/v1/tool/<toolname>/deploy/<deploy-id> | Updates the deployment (if not in terminal "finished" or "error" states) and returns the deployment info (see details below) |
| GET /openapi.json | Returns the openapi definition |
Note the <toolname> in the path: for now we will just check that the user (as passed in the SSL header) matches the tool in the path, but this opens the door to allowing a different user to deploy a tool once we have proper user auth.
Details on the call to POST /tool/<toolname>/deploy
The call to the endpoint should create a deployment data structure (see above) and return its ID.
It should clean up the oldest deployments when reaching the quota limit.
It should trigger all the builds in parallel; we will have to experiment with how Tekton handles many parallel builds, but it should not be an issue.
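The parallel-trigger step could look roughly like this thread-pool sketch; `start_build` is a hypothetical stand-in for creating a Tekton build run, not the real Tekton API:

```python
from concurrent.futures import ThreadPoolExecutor


def trigger_builds(components: dict[str, dict]) -> dict[str, str]:
    """Start one build per component concurrently, returning the build ID
    for each component (placeholder logic, no real Tekton call)."""

    def start_build(component: str) -> str:
        # Placeholder: would create a Tekton run for this component.
        return f"build-id-for-{component}"

    with ThreadPoolExecutor() as pool:
        futures = {c: pool.submit(start_build, c) for c in components}
        return {c: f.result() for c, f in futures.items()}
```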
Details on the call to GET /tool/<toolname>/deploy/<deploy-id>
This should update the Deployment datastructure in k8s and return the updated one.
We might want to implement a background process for updating the data structure when we need to do actual job deployments, but we can wait until the API is more stable, as it might require evaluating alternatives (https://fastapi.tiangolo.com/tutorial/background-tasks, Celery, Redis, Tekton, ...).
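The update-on-GET logic can be sketched as aggregating per-build states into a deployment state, skipping the refresh once a terminal state is reached; "finished" and "error" come from the endpoint description above, while "running" is an assumed intermediate state:

```python
TERMINAL_STATES = {"finished", "error"}


def aggregate_state(build_states: list[str]) -> str:
    """Compute the deployment state from its builds' states
    (assumed state names)."""
    if "error" in build_states:
        return "error"
    if build_states and all(s == "finished" for s in build_states):
        return "finished"
    return "running"


def maybe_refresh(current_state: str, build_states: list[str]) -> str:
    """Only re-compute the state when the deployment is not already in a
    terminal "finished" or "error" state."""
    if current_state in TERMINAL_STATES:
        return current_state
    return aggregate_state(build_states)
```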