Thanks for all of the assistance!
Mon, Jul 26
Our rough plan at the moment is to puppetize installation of gitlab-runner on nodes, and leave a manual configuration step for registering each as a Docker executor.
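For the manual registration step, the shape would be something like the following (a hedged sketch: the URL, token, image, and description are placeholders, not our actual values; flag names follow gitlab-runner's `register` command):

```shell
# Hypothetical sketch: register an already-installed gitlab-runner as a
# Docker executor, non-interactively. All values below are placeholders.
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.org/" \
  --registration-token "REDACTED" \
  --executor docker \
  --docker-image "debian:stable" \
  --description "$(hostname) docker runner"
```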
Fri, Jul 23
Thu, Jul 22
Wed, Jul 21
Tue, Jul 20
See Draft: Add tools for configuring all projects. Needs a bit of work, but presumably we can run configure-projects as a periodic job.
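If we go the periodic-job route, a systemd timer might be the simplest shape (unit names, paths, and schedule here are all hypothetical sketches, not a decided design):

```ini
# /etc/systemd/system/configure-projects.timer (hypothetical)
[Unit]
Description=Periodically apply gitlab-settings to all projects

[Timer]
OnCalendar=hourly
RandomizedDelaySec=300

[Install]
WantedBy=timers.target
```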
Since we're currently using a value of cas3, I'm assuming this was just an error on my part.
Mon, Jul 19
Multiple-value settings: https://gitlab.wikimedia.org/releng/gitlab-settings/-/merge_requests/1
Fri, Jul 16
Working on a patch; it's a bit fiddly because the omnibus config wants to just jam the value of nginx['log_format'] inside a single-quoted string, and I'm not sure if it allows for setting the escape parameter, but there's probably some workaround.
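For reference, the kind of value I'm trying to get in there (an untested sketch; field names are my reading of ECS, and whether omnibus lets an escape setting through is exactly the open question above):

```ruby
# /etc/gitlab/gitlab.rb -- untested sketch, not a working config.
# JSON keys are assumed ECS field names; $-variables are standard nginx ones.
nginx['log_format'] = '{"timestamp":"$time_iso8601",' \
  '"client.ip":"$remote_addr",' \
  '"http.request.method":"$request_method",' \
  '"url.original":"$request_uri",' \
  '"http.response.status_code":$status,' \
  '"http.response.body.bytes":$body_bytes_sent,' \
  '"user_agent.original":"$http_user_agent"}'
```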
Ah, cool, gotcha. On that point: yes, if it's possible to publish the images from these merged changes, that'd be fantastic:
Ok, it looks like we just need to set nginx['log_format'] to that value in /etc/gitlab.rb.
(Apologies - above log message is an error on my part, was attempting to reuse that log line for a different change.)
Is it possible to configure the nginx component to emit ECS-compatible access logs natively?
Thu, Jul 15
Wed, Jul 14
Published successfully, I think; let me know if it's not working.
There have been 9 of these in the last 3 hours, and they're happening on wmf.12 as well. Does this actually constitute a train blocker?
Still seeing instances of this since the most recent patch was deployed:
Tue, Jul 13
Frequency seems down, but @dancy just pointed out ~3 of these in the last hour.
Note: During sync, you're likely to see something like:
Should this be backported to both wmf.12 and .14?
Mon, Jul 12
Noting that, given the rate of these, I probably observed a handful while doing log triage during 1.37.0-wmf.12 (T281153) and mentally filed them as "typical low-level database error". I could use a better working model of which DB errors represent a code regression (it might be worth documenting a bit for other deployers).
Sun, Jul 4
Backport deployed, confirmed that enwiki AbuseLog now shows diff links for new edits.
Doing an emergency deploy shortly for T286140.
I can deploy; going ahead as I think it will be clear from reproduction steps above if the fix worked.
Fri, Jul 2
Thanks for digging into the layout of this so thoroughly.
Port is open and in use.
Does GitLab 2FA work in conjunction with LDAP?
Many thanks to @Jdforrester-WMF et al. for handling of emergency deploys during the US night.
Thu, Jul 1
If we're bikeshedding the naming, I don't have strong feelings here, but "gitlab-runners" does align exactly with the name of the software we intend to run.
Confirmed fixed on mwdebug2001; syncing.
Noticing some broken styles in links at https://commons.m.wikimedia.org/wiki/Main_Page
Rolling back to group0 for T285951, per IRC discussion.
It looks like T285951: Some section links in search results are redlinks is a regression in wmf.12. I don’t know if it’s significant enough to block the train.
Wed, Jun 30
End-of-US-day status: things are looking pretty stable on group1; we expect to proceed as usual to all wikis tomorrow.
This is now at https://gitlab.wikimedia.org/releng/gitlab-settings/ - GPL-3.0 license in the repo.