Debian packages built by the GitLab CI runners will initially be available on a staging apt repository (e.g. to allow testing). Compared to our main repository, the content of this staging repository will be relatively short-lived (since packages will either be synced to the main repo or discarded). For an initial buildout:
- Create an apt-staging2001 host (we're fine without DC redundancy for the prototype, and we have far more space in codfw since there's no WMCS or analytics presence there).
- We can fully reuse the component configs etc. from Puppet (and they'd stay in sync for all changes as well), but we'll need some smaller Puppet changes.
- Create a repo signing PGP key, which the CI runners use to sign the .changes files.
- Configure automated incoming processing (there's already an old task for this: T215812) and distribute the public half of the CI signing key via Puppet to the staging repo server so that it can validate the .changes files.
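With reprepro, the incoming processing and .changes validation from the last two points could look roughly like this. The names, directories and key ID below are made-up placeholders, and the real config would be managed via Puppet:

```
# conf/incoming on the staging host (hypothetical paths/names)
Name: ci
IncomingDir: /srv/wikimedia-staging/incoming
TempDir: /srv/wikimedia-staging/tmp
Allow: bookworm-wikimedia bullseye-wikimedia
Cleanup: on_deny on_error

# conf/uploaders -- only accept .changes files signed by the CI key
allow * by key 0123456789ABCDEF
```

Each distribution listed under Allow: would also need an "Uploaders: uploaders" line in conf/distributions, and a systemd timer (or inoticoming) would trigger "reprepro processincoming ci" for new uploads.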
Eventually, once the whole system works and we've fine-tuned it, we can figure out whether to fold the staging repo into the main repository server or keep it running in parallel. Both approaches have their ups and downs, but let's decide that once we have something running.
For the later promotion from staging -> apt.wikimedia.org I'm thinking of something similar to puppet-merge, e.g. a command line "apt-merge bookworm component/ci jenkins-glue=1.23-1+wmf1", which prompts an SRE for confirmation and then syncs the packages to apt.wikimedia.org (and drops them from the staging repo).
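As a rough illustration of that flow, a minimal sketch of such an apt-merge wrapper could look as follows. The helper command on the main repo host, the hostnames and the reprepro invocation are all assumptions for illustration, not an existing tool:

```python
#!/usr/bin/env python3
"""Hypothetical sketch of "apt-merge": show what would be synced,
ask an SRE for confirmation, then copy the package to the main repo
and drop it from staging. All commands and paths are illustrative."""

MAIN_HOST = "apt.wikimedia.org"             # assumption: main repo host
STAGING_BASEDIR = "/srv/wikimedia-staging"  # assumption: staging reprepro basedir


def build_commands(distro, component, pkgspec):
    """Translate e.g. ('bookworm', 'component/ci', 'jenkins-glue=1.23-1+wmf1')
    into the commands to run: a (hypothetical) import on the main repo
    host, then a removesrc on the staging repo."""
    name, _, version = pkgspec.partition("=")
    suite = f"{distro}-wikimedia"
    return [
        # Hypothetical helper on the main repo host that pulls from staging:
        ["ssh", MAIN_HOST, "apt-import-from-staging",
         suite, component, name, version],
        # Drop the package from the staging repo afterwards:
        ["reprepro", "-b", STAGING_BASEDIR, "-C", component,
         "removesrc", suite, name],
    ]


def apt_merge(distro, component, pkgspec, confirm=input):
    """Print the planned commands and prompt for confirmation, puppet-merge
    style; a real tool would then execute them via subprocess.run()."""
    for cmd in build_commands(distro, component, pkgspec):
        print("  " + " ".join(cmd))
    if confirm("Merge to apt.wikimedia.org? [y/N] ").strip().lower() != "y":
        print("Aborted, nothing synced.")
        return False
    return True
```

The confirmation prompt is the important bit: like puppet-merge, the tool only shows and gates the operation; the actual sync stays a deliberate SRE action.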