Right now we keep docker-pkg images up to date when they are used in Blubber by:
a) disabling caching
b) always pulling the latest image.
Every week we rebuild all these base images with the --nightly switch of docker-pkg, which applies a new batch of apt updates and keeps the base images current.
On the other hand, when we rebuild child images to add new features, we build from the version recorded in the official changelog, which might reference an older base image that might or might not be broken. This leads to nasty bugs like T344438 (the GPG keys shipped in the outdated base image had expired on the upstream repository; the fix was to force a rebuild).
I think there are three ways to solve this:
- We create a script with permission to commit to the production-images repo, which weekly runs docker-pkg update using the base images as a starting point, then merges the changes. The changelogs will eventually grow very large, but this ensures consistency. Ideally, we move all of this to CI when we move to Gitlab.
- We modify docker-pkg so that, for the base image, it doesn't use the version in the changelog but instead any more recent nightly version. This would, however, remove some of the advantages of using a tool like docker-pkg.
- We re-think docker-pkg as an application backed by a database, where we track dependencies and which image tag was used to build each child image.
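To make the third option concrete, here is a minimal sketch of the bookkeeping it implies: record, for every image built, the exact tag of the base image it was built from, so a later run can detect children built from a now-stale base. All class and method names here are hypothetical, not part of docker-pkg.

```python
# Sketch only: hypothetical bookkeeping for tracking which base image tag
# each child image was built from. Not docker-pkg code.
from dataclasses import dataclass


@dataclass(frozen=True)
class BuildRecord:
    image: str       # image name, e.g. "python3-build"
    tag: str         # tag produced by this build
    base_image: str  # the FROM image
    base_tag: str    # exact base tag used at build time


class BuildDB:
    """Tracks the base tag each image was last built from."""

    def __init__(self):
        self._records = {}

    def record_build(self, rec: BuildRecord):
        self._records[rec.image] = rec

    def stale_children(self, base_image: str, current_base_tag: str):
        """Images built from an older tag of base_image than the current one."""
        return [
            r.image for r in self._records.values()
            if r.base_image == base_image and r.base_tag != current_base_tag
        ]


db = BuildDB()
# The base image was rebuilt nightly and now carries a newer tag...
db.record_build(BuildRecord("base", "1.0.0-20230801", "debian", "bullseye"))
# ...but the child was last built against an older base tag:
db.record_build(BuildRecord("python3", "3.11-2", "base", "1.0.0-20230701"))
print(db.stale_children("base", "1.0.0-20230801"))  # -> ['python3']
```

With this kind of record, a rebuild of a base image could automatically trigger (or at least flag) rebuilds of its stale children instead of relying on the changelog version.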
I think by far the simplest and most effective way of solving this problem is the first option (rebuild images on a weekly basis to catch up with base image updates). I need to check with release engineering:
- How hard it would be to run docker-pkg in Gitlab's CI.
- How hard it would be to grant a specific bot user the ability to push commits to the repository.
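As a starting point for that conversation, a scheduled GitLab pipeline for the weekly run might look roughly like the fragment below. The job name, runner setup, exact docker-pkg flags, and target branch are all assumptions to be confirmed, not an existing pipeline.

```yaml
# Hypothetical .gitlab-ci.yml fragment for the weekly rebuild (option 1).
weekly-rebuild:
  rules:
    # Only run from a scheduled pipeline, not on every push
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    # Regenerate changelogs against the current base images
    # (exact config path and flags to be confirmed)
    - docker-pkg -c config.yaml update images/
    - git add -A
    - git commit -m "Weekly automated rebuild against updated base images"
    # Pushing requires a bot user with commit rights on production-images,
    # which is exactly what we need to ask release engineering about
    - git push origin HEAD
```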