New Wikidata builds (which get deployed to beta) are currently made manually. It would be very helpful to automate this, e.g. with a Jenkins job that creates a Wikidata build daily.
There is something that pushes to gerrit from jenkins@gallium, but that is not an option for Wikidata as we don't want to run composer there. Quick grepping revealed nothing else.
My first idea was to use the same mechanism that is supposed to replace ssh agent forwarding for deployment (an ssh agent that the deployer, here the Jenkins job user, has access to, but which runs as a different user), generate the key via puppet, and register the public key manually in gerrit. But this will not work with isolated Jenkins jobs, where the VM is created for just a single job. Those jobs still need some way to store their build artifacts, but I found no mention of how that will work at https://www.mediawiki.org/w/index.php?title=Continuous_integration/Architecture/Isolation .
While I would prefer using Jenkins (i.e. our CI/build infrastructure) for doing builds, there are still two other options: 1) a VM running just for this purpose, 2) a tool in Tool Labs that is run regularly via cron.
I'm going for (1), as that is probably the least work, and this is only needed until deployment to beta includes running composer. (Once beta supports composer, builds will only be needed for deployment branches, which should be made manually before a submodule update.)
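To make (1) concrete: the daily build on such a VM could be driven by a cron entry invoking a small script that checks out the build repo, runs composer, and pushes the result to gerrit for review. This is only a sketch of the idea, not the actual setup; the repository URL, branch, paths, and schedule below are all placeholder assumptions, and it presumes git, composer, and a gerrit-registered ssh key are available on the VM.

```shell
#!/bin/bash
# Hypothetical daily Wikidata build script (placeholder paths/URLs).
# Would be triggered from the VM's crontab, e.g.:
#   0 3 * * * /usr/local/bin/wikidata-build.sh
set -euo pipefail

BUILD_DIR=/srv/wikidata-build            # assumed working directory
REPO=ssh://gerrit.example.org/wikidata   # placeholder repo URL

# Start from a fresh checkout so stale vendor/ contents never leak
# into a build.
rm -rf "$BUILD_DIR"
git clone "$REPO" "$BUILD_DIR"
cd "$BUILD_DIR"

# Pull in dependencies -- this is the composer step we don't want to
# run on gallium.
composer install --no-dev --optimize-autoloader

# Commit the built dependencies and push the build for review.
git add -f vendor/
git commit -m "New Wikidata build $(date -u +%F)"
git push origin HEAD:refs/for/master
```

Pushing to `refs/for/master` would create a gerrit change rather than bypassing review, so a bad automated build can still be caught before it reaches beta.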