Package and deploy Varnish 6.0.9
Closed, Resolved · Public


Varnish version 6.0.9 is out. The new version includes the following changes:

  • Increase the default stack size to 64k. (3617)
  • Correctly reset the received SIGHUP flag in libvarnishapi when no handling function for this signal is registered. (3437)
  • Make it possible to set the sess.timeout_idle VCL variable also when using VCL syntax version 4.0. (3564)
  • New varnishstat counter MAIN.esi_req. This increments for each ESI subrequest that is made.
  • The vcl.show -v CLI command now outputs the builtin VCL source last, after any included VCL source files.
  • Improve the ability of log utilities to detect log overruns. (3716)
  • Add an option to configure to use libunwind for the stack backtrace in the panic output. (3717)
  • A bug in vmod_blob for base64 decoding with a length argument and non-padding decoding has been fixed. (3378)
  • The socket option inheritance checks now correctly identify situations where UDS and TCP listening sockets behave differently, and are no longer subject to the order in which the inheritance checks happen to be executed. (3732)
  • IPv6 listen endpoint address strings are now printed using brackets.

Among the changes, libunwind support for backtraces seems particularly interesting. It requires a minor packaging change: I think just adding libunwind to Build-Depends should be sufficient for the build system to automatically use it, or at worst we'll have to pass --with-unwind to configure.
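As a sketch, the Build-Depends change could look like the following. The surrounding entries (libjemalloc-dev, libpcre3-dev) are guesses inferred from the -ljemalloc and -lpcre linker flags in the build output below, not the actual contents of our debian/control:

```diff
--- a/debian/control
+++ b/debian/control
@@
 Build-Depends: debhelper (>= 10),
                libjemalloc-dev,
                libpcre3-dev,
+               libunwind-dev,
                python3-docutils
```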

Follows-up: T292290: Package and deploy Varnish 6.0.8

Event Timeline

ema triaged this task as Medium priority. Jan 7 2022, 9:53 AM

Change 752151 had a related patch set uploaded (by Ema; author: Ema):

[operations/debs/varnish4@debian-wmf] Use libunwind for backtraces

With the above patch, configure is called as follows:

./configure --build=x86_64-linux-gnu --prefix=/usr --includedir=\${prefix}/include --mandir=\${prefix}/share/man --infodir=\${prefix}/share/info --sysconfdir=/etc --localstatedir=/var --disable-silent-rules --libdir=\${prefix}/lib/x86_64-linux-gnu --runstatedir=/run --disable-maintainer-mode --disable-dependency-tracking --localstatedir=/var/lib --with-unwind
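For reference, the usual debhelper way to append such a flag is an override in debian/rules; this is a hypothetical sketch, as the actual rules file may structure its configure arguments differently:

```makefile
# debian/rules (excerpt): let debhelper compute the standard
# configure arguments, then append --with-unwind to them.
override_dh_auto_configure:
	dh_auto_configure -- --with-unwind
```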

And indeed, later on, -lunwind is passed to the linker:

/bin/bash ../../libtool  --tag=CC   --mode=link gcc -DNOT_IN_A_VMOD -DVARNISH_STATE_DIR='"/var/lib/varnish"' -DVARNISH_VMOD_DIR='"/usr/lib/x86_64-linux-gnu/varnish/vmods"' -DVARNISH_VCL_DIR='"/etc/varnish:/usr/share/varnish/vcl"' -DUNW_LOCAL_ONLY -I/usr/include/x86_64-linux-gnu -g -O2 -fdebug-prefix-map=/build/varnish-6.0.9=. -fstack-protector-strong -Wformat -Werror=format-security -pthread -Wall -Werror -Wno-error=unused-result -export-dynamic -Wl,-z,relro -o varnishd varnishd-cache_acceptor.o varnishd-cache_backend.o varnishd-cache_backend_probe.o varnishd-cache_ban.o varnishd-cache_ban_build.o varnishd-cache_ban_lurker.o varnishd-cache_busyobj.o varnishd-cache_cli.o varnishd-cache_deliver_proc.o varnishd-cache_director.o varnishd-cache_esi_deliver.o varnishd-cache_esi_fetch.o varnishd-cache_esi_parse.o varnishd-cache_expire.o varnishd-cache_fetch.o varnishd-cache_fetch_proc.o varnishd-cache_gzip.o varnishd-cache_hash.o varnishd-cache_http.o varnishd-cache_lck.o varnishd-cache_main.o varnishd-cache_mempool.o varnishd-cache_obj.o varnishd-cache_panic.o varnishd-cache_pool.o varnishd-cache_range.o varnishd-cache_req.o varnishd-cache_req_body.o varnishd-cache_req_fsm.o varnishd-cache_rfc2616.o varnishd-cache_session.o varnishd-cache_shmlog.o varnishd-cache_tcp_pool.o varnishd-cache_vary.o varnishd-cache_vcl.o varnishd-cache_vcl_vrt.o varnishd-cache_vrt.o varnishd-cache_vrt_priv.o varnishd-cache_vrt_re.o varnishd-cache_vrt_var.o varnishd-cache_vrt_vmod.o varnishd-cache_wrk.o varnishd-cache_ws.o varnishd-common_vsc.o varnishd-common_vsmw.o varnishd-hash_classic.o varnishd-hash_critbit.o varnishd-hash_simple_list.o varnishd-mgt_hash.o varnishd-vhp_decode.o varnishd-vhp_table.o varnishd-cache_http1_deliver.o varnishd-cache_http1_fetch.o varnishd-cache_http1_fsm.o varnishd-cache_http1_line.o varnishd-cache_http1_pipe.o varnishd-cache_http1_proto.o varnishd-cache_http1_vfp.o varnishd-cache_http2_deliver.o varnishd-cache_http2_hpack.o varnishd-cache_http2_panic.o 
varnishd-cache_http2_proto.o varnishd-cache_http2_send.o varnishd-cache_http2_session.o varnishd-mgt_acceptor.o varnishd-mgt_child.o varnishd-mgt_cli.o varnishd-mgt_jail.o varnishd-mgt_jail_solaris.o varnishd-mgt_jail_unix.o varnishd-mgt_main.o varnishd-mgt_param.o varnishd-mgt_param_bits.o varnishd-mgt_param_tbl.o varnishd-mgt_param_tcp.o varnishd-mgt_param_tweak.o varnishd-mgt_pool.o varnishd-mgt_shmem.o varnishd-mgt_util.o varnishd-mgt_vcc.o varnishd-mgt_vcl.o varnishd-cache_proxy_proto.o varnishd-mgt_stevedore.o varnishd-stevedore.o varnishd-stevedore_utils.o varnishd-storage_file.o varnishd-storage_lru.o varnishd-storage_malloc.o varnishd-storage_simple.o varnishd-storage_umem.o varnishd-cache_waiter.o varnishd-cache_waiter_epoll.o varnishd-cache_waiter_kqueue.o varnishd-cache_waiter_poll.o varnishd-cache_waiter_ports.o varnishd-mgt_waiter.o  varnishd-VSC_lck.o varnishd-VSC_main.o varnishd-VSC_mempool.o varnishd-VSC_mgt.o varnishd-VSC_sma.o varnishd-VSC_smf.o varnishd-VSC_smu.o varnishd-VSC_vbe.o varnishd-builtin_vcl.o ../../lib/libvcc/libvcc.a ../../lib/libvarnish/libvarnish.a ../../lib/libvgz/libvgz.a -ljemalloc -lpcre -ldl   -lnsl  -lrt  -lm -lunwind

Looking good!

Mentioned in SAL (#wikimedia-operations) [2022-01-07T14:05:40Z] <ema> upgrade varnish on deployment-cache-text06 to 6.0.9 T298758

Change 752151 merged by Ema:

[operations/debs/varnish4@debian-wmf] Use libunwind for backtraces

Change 752153 had a related patch set uploaded (by Ema; author: Ema):

[operations/debs/varnish4@debian-wmf] Release 6.0.9-1wm1

Smoke testing of 6.0.9 is fine on deployment-prep, I'll start upgrading production nodes next week.

Change 752153 merged by Ema:

[operations/debs/varnish4@debian-wmf] Release 6.0.9-1wm1

Mentioned in SAL (#wikimedia-operations) [2022-01-10T16:52:14Z] <ema> varnish 6.0.9-1wm1 uploaded to buster-wikimedia - component/varnish6 T298758

Mentioned in SAL (#wikimedia-operations) [2022-01-11T09:23:50Z] <ema> cp4021 (upload), cp4027 (text): upgrade varnish to 6.0.9-1wm1 T298758

Mentioned in SAL (#wikimedia-operations) [2022-01-13T10:02:47Z] <mmandere> cp3052: upgrade varnish to 6.0.9-1wm1 T298758

Mentioned in SAL (#wikimedia-operations) [2022-01-13T14:56:41Z] <mmandere> cp3053: upgrade varnish to 6.0.9-1wm1 T298758

We've analyzed cp3052 and cp3053 (text and upload nodes respectively) and compared the following resources:

  • Cache hits
  • Request rates
  • CPU usage
  • P75 TTFB (Time to First Byte)

against other cache nodes of the same cluster in the same datacenter that run the older Varnish 6.0.8. The analysis has shown no anomalies in the aforementioned resources: the values are comparable, with no significant improvement noted either.
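As an illustration of the p75 part of such a comparison, here is a minimal sketch using made-up sample values, not actual measurements from the nodes above:

```shell
#!/bin/sh
# Hypothetical TTFB samples in milliseconds from one node.
samples="120 135 128 142 119 131 127 140"

# p75: the value at position ceil(0.75 * N) in the sorted sample list.
p75=$(printf '%s\n' $samples | sort -n | awk '
    { a[NR] = $1 }
    END { i = int(NR * 0.75); if (i * 4 < NR * 3) i++; print a[i] }')
echo "p75 TTFB: ${p75}ms"
```

The same calculation run against each node group gives a single per-group number that can be compared directly.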

With this information we can therefore proceed to upgrade Varnish from version 6.0.8 to version 6.0.9 on all our cache instances in every datacenter.

Mentioned in SAL (#wikimedia-operations) [2022-01-18T11:06:28Z] <mmandere> start rolling upgrade to varnish 6.0.9 T298758

Mentioned in SAL (#wikimedia-operations) [2022-01-19T14:33:18Z] <mmandere> esams: upgrade varnish to 6.0.9 T298758

Mentioned in SAL (#wikimedia-operations) [2022-01-19T15:40:58Z] <mmandere> cp5005,cp4025: upgrade varnish to 6.0.9 T298758

MMandere claimed this task.

We now have Varnish upgraded from 6.0.8 to 6.0.9 on all our cache instances (across all datacenters). Everything is working fine, and so far there are no reported issues arising from the upgrade. We'll continue monitoring, but for now we'll go ahead and mark the task as resolved.