
upgrade facter and puppet across the fleet
Closed, Resolved · Public

Description

We should upgrade the puppet code base so that it works with facter v3. Once this is in place, we should upgrade all systems with backported packages for puppet and facter.

puppet version: 5.5.10-2
facter version: 3.11.0-1.1+b1

Details

Related Gerrit Patches:
operations/puppet : production | Switch puppetdb1001/1002 to facter 3/puppet 5
operations/puppet : production | Switch deployment-prep to facter 3 / puppet 5
operations/puppet : production | facter3/puppet5: clean up old config
operations/puppet : production | puppet5/facter3: ensure puppet master infrastructre is not upgraded
operations/puppet : production | puppet5/facter3: ensure puppet master infrastructre is not upgraded
operations/puppet : production | facter3/puppet5: enable puppet5/facter3 eqiad
operations/puppet : production | facter3/puppet5: enable puppet5/facter3 codfw
operations/puppet : production | facter3/puppet5: enable puppet5/facter3 ulsfo
operations/puppet : production | facter3/puppet5: enable puppet5/facter3 esams
operations/puppet : production | facter3/puppet5: enable puppet5/facter3 eqsin
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | facter3/puppet5: update interface fact parsing
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | puppet5/facter3: Revert upgrade until interfaces fact fixed
operations/puppet : production | puppet5/facter3: update canary
operations/puppet : production | facter3/puppet5: downgrade canaries
operations/puppet : production | facter3/puppet5: upgrade canaries
operations/puppet : production | facter3/puppet5: upgrade canaries
operations/puppet : production | facter3/puppet5: add version glue back
operations/puppet : production | facter3/puppet5: upgrade canary-bastion host
operations/puppet : production | canary host: test method for having canary hosts
operations/puppet : production | canary host: test method for having canary hosts
operations/puppet : production | puppet: Refactor of the base::puppet class
operations/puppet : production | facter3: add uniqueid fact
operations/puppet : production | facter3/puppet5: migrate systems to puppet5/facter3
operations/puppet : production | facter3/puppet5: migrate systems to puppet5/facter3
operations/puppet : production | puppet_major_version4: remove old puppet_major_version variable.
operations/puppet : production | facter3/puppet5: upgrade puppet and facter on canary hosts
operations/puppet : production | Don't install facter 2.4 in buster installs
operations/puppet : production | facter3/puppet5: Introduce parameters to introduce facter and puppet
operations/puppet : production | facter3 and puppet5: add repositories for puppet5 and facter3
operations/puppet : production | aptrepo: create new components for facter3 and puppet5

Event Timeline

jbond added a comment. Apr 5 2019, 4:37 PM

@CDanis thanks, I will try a patch with some of the other maps. However, the problem is that std::unordered_map is available but has a bug in the library[1], so the bug may trigger with them as well. Further, I didn't think the performance difference would be that noticeable in facter, and as it's only facter on jessie, the risk is even smaller.

[1]https://github.com/gcc-mirror/gcc/commit/6f7e9b8753fa2ea98cc12a512e6a855c9e24e60e

CDanis added a comment. Apr 5 2019, 4:42 PM

Ah, got it. Sorry for not reading more of the context here, just saw that one line and thought "uh oh" :)

> Ah, got it. Sorry for not reading more of the context here, just saw that one line and thought "uh oh" :)

The alternatives are even worse; clang in jessie has a fixed STL in libc++, but that would ultimately have required rebuilding Boost against libc++ as well (and Boost is used by lots of other packages), so taking a performance hit while retaining the ability to use the stock GCC 4.9 still seemed like the best option.

Change 502201 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: Introduce parameters to introduce facter and puppet

https://gerrit.wikimedia.org/r/502201

Change 501618 abandoned by Jbond:
facter3 and puppet5: add repositories for puppet5 and facter3

Reason:
will use 502201 instead as the refactor this depends on will take longer

https://gerrit.wikimedia.org/r/501618

Change 502234 had a related patch set uploaded (by Muehlenhoff; owner: Muehlenhoff):
[operations/puppet@production] Don't install facter 2.4 in buster installs

https://gerrit.wikimedia.org/r/502234

Change 502201 merged by Jbond:
[operations/puppet@production] facter3/puppet5: Introduce parameters to introduce facter and puppet

https://gerrit.wikimedia.org/r/502201

Change 502234 merged by Muehlenhoff:
[operations/puppet@production] Don't install facter 2.4 in buster installs

https://gerrit.wikimedia.org/r/502234

jbond added a comment. Apr 9 2019, 12:52 PM

Below is a diff between facter2 and facter3. Most things are the same, but a few things are different; of course there are also many more new structured facts.

  • blockdevices: order seems to be different
-blockdevices => fd0,sr0,vda
+blockdevices => sr0,fd0,vda
  • the $::ps fact is missing
  • the $::serialnumber fact is missing
  • the $::type fact seems to have been replaced with $::chassistype or $facts['dmi']['chassis']['type']

We don't appear to use any of these facts.
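
For the replaced $::type fact, a minimal sketch of what the migration might look like in a manifest (the variable name is illustrative, not from the repo):

```puppet
# Hypothetical example: the legacy $::type fact is gone under facter 3;
# the structured dmi fact carries the same value (facter 3 only).
$chassis_type = $facts['dmi']['chassis']['type']

# The flat replacement fact shown in the diff is equivalent:
# $chassis_type = $facts['chassistype']
```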

full diff

acmechief-test1001 ~ % diff -u --color facter*                                                    [12:50:38]
--- facter2     2019-04-09 12:39:45.004362292 +0000
+++ facter3     2019-04-09 12:40:17.896503535 +0000
@@ -1,4 +1,7 @@
 architecture => amd64
+augeas => {
+  version => "1.11.0"
+}
 augeasversion => 1.11.0
 bios_release_date => 04/01/2014
 bios_vendor => SeaBIOS
@@ -9,23 +12,68 @@
 blockdevice_sr0_vendor => QEMU
 blockdevice_vda_size => 10737418240
 blockdevice_vda_vendor => 0x1af4
-blockdevices => fd0,sr0,vda
-default_routes => {"ipv4"=>"10.64.32.1", "ipv6"=>"fe80::1"}
+blockdevices => sr0,fd0,vda
+chassistype => Other
+default_routes => {
+  ipv4 => "10.64.32.1",
+  ipv6 => "fe80::1"
+}
+disks => {
+  fd0 => {
+    size => "4.00 KiB",
+    size_bytes => 4096
+  },
+  sr0 => {
+    model => "QEMU DVD-ROM",
+    size => "1.00 GiB",
+    size_bytes => 1073741312,
+    vendor => "QEMU"
+  },
+  vda => {
+    size => "10.00 GiB",
+    size_bytes => 10737418240,
+    vendor => "0x1af4"
+  }
+}
+dmi => {
+  bios => {
+    release_date => "04/01/2014",
+    vendor => "SeaBIOS",
+    version => "1.10.2-1"
+  },
+  chassis => {
+    type => "Other"
+  },
+  manufacturer => "QEMU",
+  product => {
+    name => "Standard PC (i440FX + PIIX, 1996)",
+    uuid => "da20b4de-8c39-4e36-8cf8-e913d36b3550"
+  }
+}
 domain => eqiad.wmnet
-facterversion => 2.4.6
+facterversion => 3.11.0
 filesystems => ext2,ext3,ext4
+fips_enabled => false
 fqdn => acmechief-test1001.eqiad.wmnet
 gid => root
 hardwareisa => unknown
 hardwaremodel => x86_64
 hostname => acmechief-test1001
 id => root
+identity => {
+  gid => 0,
+  group => "root",
+  privileged => true,
+  uid => 0,
+  user => "root"
+}
 initsystem => systemd
 interface_primary => ens5
 interfaces => ens5,lo
 ipaddress => 10.64.32.86
 ipaddress6 => 2620:0:861:103:a800:ff:fe43:e618
 ipaddress6_ens5 => 2620:0:861:103:a800:ff:fe43:e618
+ipaddress6_lo => ::1
 ipaddress_ens5 => 10.64.32.86
 ipaddress_lo => 127.0.0.1
 is_pe => false
@@ -34,9 +82,21 @@
 kernelmajversion => 4.19
 kernelrelease => 4.19.0-4-amd64
 kernelversion => 4.19.0
-lldp => {"ens5"=>{"neighbor"=>"ganeti1003.eqiad.wmnet", "port"=>"fe:70:cb:81:6d:ba"}}
-lldp_neighbors => ["ganeti1003.eqiad.wmnet"]
+lldp => {
+  ens5 => {
+    neighbor => "ganeti1003.eqiad.wmnet",
+    port => "fe:70:cb:81:6d:ba"
+  }
+}
+lldp_neighbors => [
+  "ganeti1003.eqiad.wmnet"
+]
 lldp_parent => ganeti1003.eqiad.wmnet
+load_averages => {
+  15m => 0.02,
+  1m => 0.05,
+  5m => 0.07
+}
 lsbdistcodename => buster
 lsbdistdescription => Debian GNU/Linux buster/sid
 lsbdistid => Debian
@@ -45,35 +105,339 @@
 macaddress => aa:00:00:43:e6:18
 macaddress_ens5 => aa:00:00:43:e6:18
 manufacturer => QEMU
-memoryfree => 1.77 GB
-memoryfree_mb => 1811.04
-memorysize => 1.95 GB
+memory => {
+  swap => {
+    available => "975.00 MiB",
+    available_bytes => 1022357504,
+    capacity => "0%",
+    total => "975.00 MiB",
+    total_bytes => 1022357504,
+    used => "0 bytes",
+    used_bytes => 0
+  },
+  system => {
+    available => "1.77 GiB",
+    available_bytes => 1898647552,
+    capacity => "9.25%",
+    total => "1.95 GiB",
+    total_bytes => 2092163072,
+    used => "184.55 MiB",
+    used_bytes => 193515520
+  }
+}
+memoryfree => 1.77 GiB
+memoryfree_mb => 1810.69
+memorysize => 1.95 GiB
 memorysize_mb => 1995.24
+mountpoints => {
+  / => {
+    available => "7.38 GiB",
+    available_bytes => 7927390208,
+    capacity => "16.49%",
+    device => "/dev/vda1",
+    filesystem => "ext4",
+    options => [
+      "rw",
+      "relatime",
+      "errors=remount-ro"
+    ],
+    size => "8.84 GiB",
+    size_bytes => 9492197376,
+    used => "1.46 GiB",
+    used_bytes => 1564807168
+  },
+  /dev/shm => {
+    available => "997.62 MiB",
+    available_bytes => 1046081536,
+    capacity => "0%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "rw",
+      "nosuid",
+      "nodev"
+    ],
+    size => "997.62 MiB",
+    size_bytes => 1046081536,
+    used => "0 bytes",
+    used_bytes => 0
+  },
+  /run => {
+    available => "191.64 MiB",
+    available_bytes => 200953856,
+    capacity => "3.95%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "rw",
+      "nosuid",
+      "noexec",
+      "relatime",
+      "size=204316k",
+      "mode=755"
+    ],
+    size => "199.53 MiB",
+    size_bytes => 209219584,
+    used => "7.88 MiB",
+    used_bytes => 8265728
+  },
+  /run/lock => {
+    available => "5.00 MiB",
+    available_bytes => 5242880,
+    capacity => "0%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "rw",
+      "nosuid",
+      "nodev",
+      "noexec",
+      "relatime",
+      "size=5120k"
+    ],
+    size => "5.00 MiB",
+    size_bytes => 5242880,
+    used => "0 bytes",
+    used_bytes => 0
+  },
+  /run/user/18837 => {
+    available => "199.52 MiB",
+    available_bytes => 209215488,
+    capacity => "0%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "rw",
+      "nosuid",
+      "nodev",
+      "relatime",
+      "size=204312k",
+      "mode=700",
+      "uid=18837",
+      "gid=500"
+    ],
+    size => "199.52 MiB",
+    size_bytes => 209215488,
+    used => "0 bytes",
+    used_bytes => 0
+  },
+  /run/user/20774 => {
+    available => "199.52 MiB",
+    available_bytes => 209215488,
+    capacity => "0%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "rw",
+      "nosuid",
+      "nodev",
+      "relatime",
+      "size=204312k",
+      "mode=700",
+      "uid=20774",
+      "gid=500"
+    ],
+    size => "199.52 MiB",
+    size_bytes => 209215488,
+    used => "0 bytes",
+    used_bytes => 0
+  },
+  /sys/fs/cgroup => {
+    available => "997.62 MiB",
+    available_bytes => 1046081536,
+    capacity => "0%",
+    device => "tmpfs",
+    filesystem => "tmpfs",
+    options => [
+      "ro",
+      "nosuid",
+      "nodev",
+      "noexec",
+      "mode=755"
+    ],
+    size => "997.62 MiB",
+    size_bytes => 1046081536,
+    used => "0 bytes",
+    used_bytes => 0
+  }
+}
 mtu_ens5 => 1500
 mtu_lo => 65536
-net_driver => {"ens5"=>{"speed"=>-1, "duplex"=>"unknown", "driver"=>"virtio_net"}}
+net_driver => {
+  ens5 => {
+    speed => -1,
+    duplex => "unknown",
+    driver => "virtio_net"
+  }
+}
 netmask => 255.255.252.0
+netmask6 => ffff:ffff:ffff:ffff::
+netmask6_ens5 => ffff:ffff:ffff:ffff::
+netmask6_lo => ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff
 netmask_ens5 => 255.255.252.0
 netmask_lo => 255.0.0.0
+network => 10.64.32.0
+network6 => 2620:0:861:103::
+network6_ens5 => 2620:0:861:103::
+network6_lo => ::1
 network_ens5 => 10.64.32.0
 network_lo => 127.0.0.0
-numa => {"nodes"=>[0], "device_to_node"=>{"ens5"=>[0], "lo"=>[0]}, "device_to_htset"=>{"ens5"=>[[0]], "lo"=>[[0]]}}                                                                                                      
+networking => {
+  domain => "eqiad.wmnet",
+  fqdn => "acmechief-test1001.eqiad.wmnet",
+  hostname => "acmechief-test1001",
+  interfaces => {
+    ens5 => {
+      bindings => [
+        {
+          address => "10.64.32.86",
+          netmask => "255.255.252.0",
+          network => "10.64.32.0"
+        }
+      ],
+      bindings6 => [
+        {
+          address => "2620:0:861:103:a800:ff:fe43:e618",
+          netmask => "ffff:ffff:ffff:ffff::",
+          network => "2620:0:861:103::"
+        },
+        {
+          address => "fe80::a800:ff:fe43:e618",
+          netmask => "ffff:ffff:ffff:ffff::",
+          network => "fe80::"
+        }
+      ],
+      ip => "10.64.32.86",
+      ip6 => "2620:0:861:103:a800:ff:fe43:e618",
+      mac => "aa:00:00:43:e6:18",
+      mtu => 1500,
+      netmask => "255.255.252.0",
+      netmask6 => "ffff:ffff:ffff:ffff::",
+      network => "10.64.32.0",
+      network6 => "2620:0:861:103::"
+    },
+    lo => {
+      bindings => [
+        {
+          address => "127.0.0.1",
+          netmask => "255.0.0.0",
+          network => "127.0.0.0"
+        }
+      ],
+      bindings6 => [
+        {
+          address => "::1",
+          netmask => "ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff",
+          network => "::1"
+        }
+      ],
+      ip => "127.0.0.1",
+      ip6 => "::1",
+      mtu => 65536,
+      netmask => "255.0.0.0",
+      netmask6 => "ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff",
+      network => "127.0.0.0",
+      network6 => "::1"
+    }
+  },
+  ip => "10.64.32.86",
+  ip6 => "2620:0:861:103:a800:ff:fe43:e618",
+  mac => "aa:00:00:43:e6:18",
+  mtu => 1500,
+  netmask => "255.255.252.0",
+  netmask6 => "ffff:ffff:ffff:ffff::",
+  network => "10.64.32.0",
+  network6 => "2620:0:861:103::",
+  primary => "ens5"
+}
+numa => {
+  nodes => [
+    0
+  ],
+  device_to_node => {
+    ens5 => [
+      0
+    ],
+    lo => [
+      0
+    ]
+  },
+  device_to_htset => {
+    ens5 => [
+      [
+        0
+      ]
+    ],
+    lo => [
+      [
+        0
+      ]
+    ]
+  }
+}
 operatingsystem => Debian
 operatingsystemmajrelease => buster/sid
 operatingsystemrelease => buster/sid
-os => {"name"=>"Debian", "family"=>"Debian", "release"=>{"major"=>"buster/sid", "full"=>"buster/sid"}, "lsb"=>{"distcodename"=>"buster", "distid"=>"Debian", "distdescription"=>"Debian GNU/Linux buster/sid", "distrelease"=>"testing", "majdistrelease"=>"testing"}}                                                                
+os => {
+  architecture => "amd64",
+  distro => {
+    codename => "buster",
+    description => "Debian GNU/Linux buster/sid",
+    id => "Debian",
+    release => {
+      full => "testing",
+      major => "testing"
+    }
+  },
+  family => "Debian",
+  hardware => "x86_64",
+  name => "Debian",
+  release => {
+    full => "buster/sid",
+    major => "buster/sid"
+  },
+  selinux => {
+    enabled => false
+  }
+}
 osfamily => Debian
 package_provider => apt
-package_updates => <?xml version="1.0" ?><host name="acmechief-test1001.eqiad.wmnet"><package current_version="2.4.6-1+deb10u1" name="facter" new_version="3.11.0-2" origin="Debian" source_name="facter"/><package current_version="2.4-1" name="ferm" new_version="2.4-1+wmf1" origin="Wikimedia" source_name="ferm"/></host>       
-partitions => {"vda2"=>{"size"=>"2", "filesystem"=>"dos"}, "vda5"=>{"uuid"=>"52b215f4-aa2c-4ebb-b341-26bfd7e4010c", "size"=>"1996800", "filesystem"=>"swap"}, "vda1"=>{"uuid"=>"b9149094-189c-4719-9a8e-e7c7e898e64f", "size"=>"18968576", "mount"=>"/", "filesystem"=>"ext4"}}                                                       
+package_updates => <?xml version="1.0" ?><host name="acmechief-test1001.eqiad.wmnet"><package current_version="2.4-1" name="ferm" new_version="2.4-1+wmf1" origin="Wikimedia" source_name="ferm"/></host>                
+partitions => {
+  /dev/vda1 => {
+    filesystem => "ext4",
+    mount => "/",
+    partuuid => "76530831-01",
+    size => "9.04 GiB",
+    size_bytes => 9711910912,
+    uuid => "b9149094-189c-4719-9a8e-e7c7e898e64f"
+  },
+  /dev/vda2 => {
+    size => "1.00 KiB",
+    size_bytes => 1024
+  },
+  /dev/vda5 => {
+    filesystem => "swap",
+    partuuid => "76530831-05",
+    size => "975.00 MiB",
+    size_bytes => 1022361600,
+    uuid => "52b215f4-aa2c-4ebb-b341-26bfd7e4010c"
+  }
+}
 path => /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
 physicalcorecount => 1
 physicalprocessorcount => 1
 processor0 => Intel Xeon E3-12xx v2 (Ivy Bridge)
 processorcount => 1
-processors => {"models"=>["Intel Xeon E3-12xx v2 (Ivy Bridge)"], "count"=>1, "physicalcount"=>1}
+processors => {
+  count => 1,
+  isa => "unknown",
+  models => [
+    "Intel Xeon E3-12xx v2 (Ivy Bridge)"
+  ],
+  physicalcount => 1
+}
 productname => Standard PC (i440FX + PIIX, 1996)
-ps => ps -ef
 puppet_config_dir => /etc/puppet
 puppet_environmentpath => /etc/puppet/code/environments
 puppet_server => puppet
@@ -81,12 +445,42 @@
 puppetversion => 5.5.10
 raid => []
 root_home => /root
+ruby => {
+  platform => "x86_64-linux-gnu",
+  sitedir => "/usr/local/lib/site_ruby/2.5.0",
+  version => "2.5.5"
+}
 rubyplatform => x86_64-linux-gnu
 rubysitedir => /usr/local/lib/site_ruby/2.5.0
 rubyversion => 2.5.5
 selinux => false
-serialnumber => Not Specified
 service_provider => systemd
+ssh => {
+  ecdsa => {
+    fingerprints => {
+      sha1 => "SSHFP 3 1 7fe7d860fa13b4ba1f53bb46ca9835480c056277",
+      sha256 => "SSHFP 3 2 dd5d401d01ab1f7655b4968d49779426435d1db85fd9b91a2e4c2d977188da5d"
+    },
+    key => "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBjt25a1EAcKFoEoayLIumFUeTESc/VgylaWjuUIyB0JGW9YEIkOma8N7/OfaS7ud8rM9XdSqscYwrB4uXeFkBc=",                                                               
+    type => "ecdsa-sha2-nistp256"
+  },
+  ed25519 => {
+    fingerprints => {
+      sha1 => "SSHFP 4 1 4457113a9d2549cb4765e1a1a9300c4b0be790a4",
+      sha256 => "SSHFP 4 2 f16af41b14386fe25e2207ab59200d67b91f23cb2fd245759e1f7399e7e87e0b"
+    },
+    key => "AAAAC3NzaC1lZDI1NTE5AAAAIHincp/L7Q2wJf+Xtw8vTUlR6bZi9h6Osr6mw4bbq4a9",
+    type => "ssh-ed25519"
+  },
+  rsa => {
+    fingerprints => {
+      sha1 => "SSHFP 1 1 ad9bf3016d6405b52735beb035c76c552f19ea70",
+      sha256 => "SSHFP 1 2 d1927a0ecb271552a0f31dc51446f08e02e3366be54405f26cd93f87a7300c85"
+    },
+    key => "AAAAB3NzaC1yc2EAAAADAQABAAABAQDaFu1Ovn8PJqHGWJ1eppwkJSj9n+0Ay6AWBiS/bs/4bJtWf+/9j+aW48dS6qEn6dvkX3Bf6BDA+15ZMgFozn7m2u26qomrkkN3Xua1V1FpPHIE7gtZtsRju0UD/XjClyWS9FB0yO1FRPK9inLm2qAjyK2NFs57TmMwc82bpnVetIlh3wgoEwCW8QvV+ORoCuR66XeL291dU5mPZo5mbwR4eQtb2vm4qgJwMWZUZe6rjGZWG1yRgt5wS1NAu9xi57MrtUFs6xgRxsmyJmasexWtQHJTMBmhRFmJZGXpH1zOmefJo2Yfywfoh9o6cLiqL7BoIjk91pR2BCXZUs/lbesZ",                                                 
+    type => "ssh-rsa"
+  }
+}
 sshecdsakey => AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBjt25a1EAcKFoEoayLIumFUeTESc/VgylaWjuUIyB0JGW9YEIkOma8N7/OfaS7ud8rM9XdSqscYwrB4uXeFkBc=
 sshed25519key => AAAAC3NzaC1lZDI1NTE5AAAAIHincp/L7Q2wJf+Xtw8vTUlR6bZi9h6Osr6mw4bbq4a9
 sshfp_ecdsa => SSHFP 3 1 7fe7d860fa13b4ba1f53bb46ca9835480c056277
@@ -96,17 +490,20 @@
 sshfp_rsa => SSHFP 1 1 ad9bf3016d6405b52735beb035c76c552f19ea70
 SSHFP 1 2 d1927a0ecb271552a0f31dc51446f08e02e3366be54405f26cd93f87a7300c85
 sshrsakey => AAAAB3NzaC1yc2EAAAADAQABAAABAQDaFu1Ovn8PJqHGWJ1eppwkJSj9n+0Ay6AWBiS/bs/4bJtWf+/9j+aW48dS6qEn6dvkX3Bf6BDA+15ZMgFozn7m2u26qomrkkN3Xua1V1FpPHIE7gtZtsRju0UD/XjClyWS9FB0yO1FRPK9inLm2qAjyK2NFs57TmMwc82bpnVetIlh3wgoEwCW8QvV+ORoCuR66XeL291dU5mPZo5mbwR4eQtb2vm4qgJwMWZUZe6rjGZWG1yRgt5wS1NAu9xi57MrtUFs6xgRxsmyJmasexWtQHJTMBmhRFmJZGXpH1zOmefJo2Yfywfoh9o6cLiqL7BoIjk91pR2BCXZUs/lbesZ
-swapfree => 975.00 MB
-swapfree_mb => 975.00
-swapsize => 975.00 MB
-swapsize_mb => 975.00
-system_uptime => {"seconds"=>2967, "hours"=>0, "days"=>0, "uptime"=>"0:49 hours"}
+swapfree => 975.00 MiB
+swapfree_mb => 974.996
+swapsize => 975.00 MiB
+swapsize_mb => 974.996
+system_uptime => {
+  days => 0,
+  hours => 0,
+  seconds => 3003,
+  uptime => "0:50 hours"
+}
 timezone => UTC
-type => Other
-uniqueid => 400a5620
-uptime => 0:49 hours
+uptime => 0:50 hours
 uptime_days => 0
 uptime_hours => 0
-uptime_seconds => 2967
+uptime_seconds => 3003
 uuid => da20b4de-8c39-4e36-8cf8-e913d36b3550
 virtual => kvm

Change 502785 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet_major_version4: remove old puppet_major_version variable.

https://gerrit.wikimedia.org/r/502785

Change 504006 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: upgrade puppet and facter on canary hosts.

https://gerrit.wikimedia.org/r/504006

Change 504006 merged by Jbond:
[operations/puppet@production] facter3/puppet5: upgrade puppet and facter on canary hosts

https://gerrit.wikimedia.org/r/504006

Change 502785 merged by Jbond:
[operations/puppet@production] puppet_major_version4: remove old puppet_major_version variable.

https://gerrit.wikimedia.org/r/502785

Change 504303 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: migrate systems to puppet5/facter3

https://gerrit.wikimedia.org/r/504303

Change 504303 merged by Jbond:
[operations/puppet@production] facter3/puppet5: migrate systems to puppet5/facter3

https://gerrit.wikimedia.org/r/504303

Change 504308 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: migrate systems to puppet5/facter3

https://gerrit.wikimedia.org/r/504308

Change 504308 merged by Jbond:
[operations/puppet@production] facter3/puppet5: migrate systems to puppet5/facter3

https://gerrit.wikimedia.org/r/504308

The uniqueid fact is also missing.

Change 504322 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3: add uniqueid fact

https://gerrit.wikimedia.org/r/504322

Change 504322 merged by Jbond:
[operations/puppet@production] facter3: add uniqueid fact

https://gerrit.wikimedia.org/r/504322

Change 501617 merged by Jbond:
[operations/puppet@production] puppet: Refactor of the base::puppet class

https://gerrit.wikimedia.org/r/501617

Change 506437 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] canary host: test method for having canary hosts

https://gerrit.wikimedia.org/r/506437

Change 506437 merged by Jbond:
[operations/puppet@production] canary host: test method for having canary hosts

https://gerrit.wikimedia.org/r/506437

Change 506481 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] canary host: test method for having canary hosts

https://gerrit.wikimedia.org/r/506481

Change 506481 merged by Jbond:
[operations/puppet@production] canary host: test method for having canary hosts

https://gerrit.wikimedia.org/r/506481

One thing that will need to be fixed is the detection of HP machines to install 'hp-health' in modules/base/manifests/standard_packages.pp:L141, unfortunately it seems 'dmi' isn't in 2.4 yet, so this will also need to be conditional on facter 2/3.
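
A minimal sketch of such a facter 2/3 conditional, assuming the HP check keys off the manufacturer fact (the actual guard in standard_packages.pp may differ):

```puppet
# Hypothetical guard: facter 2.4 lacks the structured 'dmi' fact, so
# fall back to the legacy flat fact there.
if $facts['dmi'] {
    $manufacturer = $facts['dmi']['manufacturer']  # facter 3
} else {
    $manufacturer = $facts['manufacturer']         # facter 2.x
}

if $manufacturer == 'HP' {
    package { 'hp-health':
        ensure => present,
    }
}
```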

Change 506613 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: upgrade canary-bastion host

https://gerrit.wikimedia.org/r/506613

Change 506613 merged by Jbond:
[operations/puppet@production] facter3/puppet5: upgrade canary-bastion host

https://gerrit.wikimedia.org/r/506613

Change 506628 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: add version glue back

https://gerrit.wikimedia.org/r/506628

Change 506628 merged by Jbond:
[operations/puppet@production] facter3/puppet5: add version glue back

https://gerrit.wikimedia.org/r/506628

Change 506643 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: upgrade canaries

https://gerrit.wikimedia.org/r/506643

Change 506643 merged by Jbond:
[operations/puppet@production] facter3/puppet5: upgrade canaries

https://gerrit.wikimedia.org/r/506643

Change 506646 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: upgrade canaries

https://gerrit.wikimedia.org/r/506646

Change 506646 merged by Jbond:
[operations/puppet@production] facter3/puppet5: upgrade canaries

https://gerrit.wikimedia.org/r/506646

Change 506650 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: downgrade canaries

https://gerrit.wikimedia.org/r/506650

Change 506651 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: downgrade canaries

https://gerrit.wikimedia.org/r/506651

Change 506652 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506652

Change 506650 merged by Jbond:
[operations/puppet@production] facter3/puppet5: downgrade canaries

https://gerrit.wikimedia.org/r/506650

Change 506657 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506657

Change 506657 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506657

Change 506658 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: Revert upgrade until interfaces fact fixed

https://gerrit.wikimedia.org/r/506658

Change 506658 merged by Jbond:
[operations/puppet@production] puppet5/facter3: Revert upgrade until interfaces fact fixed

https://gerrit.wikimedia.org/r/506658

Change 506660 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506660

Change 506660 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506660

Change 506661 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506661

Change 506661 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506661

Change 506664 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506664

Change 506664 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506664

Change 506691 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506691

Change 506691 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506691

Change 506693 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506693

Change 506693 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506693

Change 506696 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506696

Change 506696 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506696

Change 506651 merged by Jbond:
[operations/puppet@production] facter3/puppet5: update interface fact parsing

https://gerrit.wikimedia.org/r/506651

Change 506652 merged by Jbond:
[operations/puppet@production] puppet5/facter3: update canary

https://gerrit.wikimedia.org/r/506652

> One thing that will need to be fixed is the detection of HP machines to install 'hp-health' in modules/base/manifests/standard_packages.pp:L141, unfortunately it seems 'dmi' isn't in 2.4 yet, so this will also need to be conditional on facter 2/3.

The manufacturer fact is still available in facter3; however, this should still be migrated. I will create a child task to migrate all legacy facts.

aqs1004 ~ % sudo facter -v
3.11.0
aqs1004 ~ % sudo facter -p manufacturer
HP
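
As a sketch of that legacy-fact migration, with the structured paths taken from the diff earlier in this task (the surrounding variable names are illustrative):

```puppet
# Hypothetical before/after pairs for migrating legacy flat facts to
# their structured facter 3 equivalents.
$manufacturer = $facts['dmi']['manufacturer']        # was $::manufacturer
$memory_total = $facts['memory']['system']['total']  # was $::memorysize
$primary_ip   = $facts['networking']['ip']           # was $::ipaddress
```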

Change 507299 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 eqsin

https://gerrit.wikimedia.org/r/507299

Change 507300 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 esams

https://gerrit.wikimedia.org/r/507300

Change 507301 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 ulsfo

https://gerrit.wikimedia.org/r/507301

Change 507302 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 codfw

https://gerrit.wikimedia.org/r/507302

Change 507303 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 eqiad

https://gerrit.wikimedia.org/r/507303

Change 507305 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] facter3/puppet5: clean up old config

https://gerrit.wikimedia.org/r/507305

Change 507299 merged by Jbond:
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 eqsin

https://gerrit.wikimedia.org/r/507299

FYI the upgrade seems to be generating cronspam, in the form of facter warnings:

Subject: Cron <root@cp5001> /usr/local/sbin/smart-data-dump --syslog --outfile /var/lib/prometheus/node.d/device_smart.prom
2019-04-30 16:34:02.691553 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.0.131 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691678 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.0.133 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691716 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.16.23 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691750 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.16.25 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691784 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.32.68 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691822 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.32.70 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691857 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.48.102 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691890 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.64.48.104 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691929 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.0.123 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691963 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.0.126 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.691996 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.16.134 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692029 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.16.137 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692062 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.32.113 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692095 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.32.116 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692128 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.32.117 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692161 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.48.24 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692195 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.48.26 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692227 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.48.28 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692260 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.48.29 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.692293 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '10.192.48.30 via 10.132.0.1 dev enp5s0f0  mtu lock 1450'
2019-04-30 16:34:02.693333 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:101:10:192:0:123 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693388 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:101:10:192:0:126 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693431 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:102:10:192:16:134 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693469 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:102:10:192:16:137 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693505 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:103:10:192:32:113 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693541 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:103:10:192:32:116 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693576 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:103:10:192:32:117 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693612 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:104:10:192:48:24 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693648 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:104:10:192:48:26 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693684 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:104:10:192:48:28 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693720 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:104:10:192:48:29 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693756 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:860:104:10:192:48:30 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693791 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:101:10:64:0:131 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693827 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:101:10:64:0:133 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693863 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:102:10:64:16:23 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693899 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:102:10:64:16:25 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693941 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:103:10:64:32:68 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.693978 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:103:10:64:32:70 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.694014 WARN  puppetlabs.facter - Could not process routing table entry: Expected a destination followed by key/value pairs, got '2620:0:861:107:10:64:48:102 via fe80::66c3:d602:8bc:c7f1 dev enp5s0f0 metric 1024  mtu lock 1450 pref medium'
2019-04-30 16:34:02.694049 WARN  puppetlabs.facter - Could not process routing table entry: Expected a
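
The warnings above all involve routes carrying the `mtu lock 1450` attribute. As a hypothetical illustration (not facter's actual code), a strict pairwise key/value parser of `ip route` output fails on exactly this sequence, because `mtu lock <n>` is three tokens rather than a key/value pair; a parser that special-cases it handles the entry cleanly:

```python
# Hypothetical sketch of the parse failure: a naive key/value pairing of
# `ip route` output breaks on the three-token `mtu lock 1450` sequence.
def parse_route(line):
    tokens = line.split()
    dest, rest = tokens[0], tokens[1:]
    attrs = {}
    i = 0
    while i < len(rest):
        key = rest[i]
        # `mtu lock <n>` uses three tokens; a strict pairwise parser
        # (which facter 3.11 appears to resemble) would misread `lock`
        # as the value and `1450` as the next key, then error out.
        if key == "mtu" and i + 2 < len(rest) and rest[i + 1] == "lock":
            attrs["mtu"] = int(rest[i + 2])
            attrs["mtu_lock"] = True
            i += 3
        else:
            if i + 1 >= len(rest):
                raise ValueError(f"dangling key {key!r} in {line!r}")
            attrs[key] = rest[i + 1]
            i += 2
    return dest, attrs

dest, attrs = parse_route(
    "10.64.0.131 via 10.132.0.1 dev enp5s0f0 mtu lock 1450"
)
```

The warnings are harmless noise (the route is simply skipped), but because facter runs from cron-driven scripts such as smart-data-dump, every run emails the output, hence the cronspam.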

Change 507300 merged by Jbond:
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 esams

https://gerrit.wikimedia.org/r/507300

Change 507301 merged by Jbond:
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 ulsfo

https://gerrit.wikimedia.org/r/507301

jbond added a comment.May 7 2019, 10:26 AM

Sorry @fgiunchedi, I must have missed this comment. This issue is being tracked in https://phabricator.wikimedia.org/T222356

Change 507302 merged by Jbond:
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 codfw

https://gerrit.wikimedia.org/r/507302

Change 507303 merged by Jbond:
[operations/puppet@production] facter3/puppet5: enable puppet5/facter3 eqiad

https://gerrit.wikimedia.org/r/507303

Change 509040 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: ensure puppet master infrastructre is not upgraded

https://gerrit.wikimedia.org/r/509040

Change 509042 had a related patch set uploaded (by Jbond; owner: John Bond):
[operations/puppet@production] puppet5/facter3: ensure puppet master infrastructre is not upgraded

https://gerrit.wikimedia.org/r/509042

Change 509042 merged by Jbond:
[operations/puppet@production] puppet5/facter3: ensure puppet master infrastructre is not upgraded

https://gerrit.wikimedia.org/r/509042

Change 509040 abandoned by Jbond:
puppet5/facter3: ensure puppet master infrastructre is not upgraded

Reason:
superseded by regex change

https://gerrit.wikimedia.org/r/509040

Change 507305 merged by Jbond:
[operations/puppet@production] facter3/puppet5: clean up old config

https://gerrit.wikimedia.org/r/507305

Mentioned in SAL (#wikimedia-operations) [2019-05-13T12:49:59Z] <moritzm> updating puppetdb on deployment-puppetdb02 to 4.4.0-1~wmf2 (T219803)

Mentioned in SAL (#wikimedia-operations) [2019-05-13T13:25:54Z] <moritzm> uploaded puppetdb 4.4.0-1~wmf2 to component/puppetdb4 for apt.wikimedia.org/stretch-wikimedia (T219803)

Mentioned in SAL (#wikimedia-operations) [2019-05-13T13:46:56Z] <moritzm> updating puppet on deployment-puppetmaster03 to 4.8.2-5+wmf1 (T219803)

Mentioned in SAL (#wikimedia-operations) [2019-05-13T14:05:47Z] <moritzm> uploaded puppet 4.8.2-5+wmf1 to component/puppetdb4 for apt.wikimedia.org/stretch-wikimedia (T219803)

puppet-common is a transitional package and is no longer needed. We currently have it installed on 282 hosts; it's probably best to simply remove it to prevent confusion: https://debmonitor.wikimedia.org/packages/puppet-common

Change 509852 had a related patch set uploaded (by Muehlenhoff; owner: Muehlenhoff):
[operations/puppet@production] Switch deployment-prep to facter 3 / puppet 5

https://gerrit.wikimedia.org/r/509852

Change 509852 merged by Muehlenhoff:
[operations/puppet@production] Switch deployment-prep to facter 3 / puppet 5

https://gerrit.wikimedia.org/r/509852

Change 510171 had a related patch set uploaded (by Muehlenhoff; owner: Muehlenhoff):
[operations/puppet@production] Switch puppetdb1001/1002 to facter 3/puppet 5

https://gerrit.wikimedia.org/r/510171

The facter upgrade in deployment-prep appears to have set the ipaddress6 fact to link-local addresses.

jbond added a comment.EditedMay 14 2019, 8:59 PM

After speaking with Alex, it seems that facter 2 set ipaddress6 to undef if there were only link-local addresses, whereas facter 3 sets it to the value of the link-local address (more investigation required).

$ grep -r ipaddress6 ./modules
./modules/base/lib/facter/interface_primary.rb:Facter.add('ipaddress6') do
./modules/base/lib/facter/interface_primary.rb:    # Do not rely on ipaddress6_#{interface_primary}, as its underlying
./modules/interface/manifests/add_ip6_mapped.pp:    $ipv6_address = inline_template("<%= require 'ipaddr'; (IPAddr.new(scope.lookupvar(\"::ipaddress6_${interface}\")).mask(64) | IPAddr.new(@v6_mapped_lower64)).to_s() %>")
./modules/profile/templates/exim/exim4.conf.mailman.erb:        interface = <; <%= @ipaddress %> ; <%= @ipaddress6 %>
./modules/profile/templates/exim/exim4.conf.mailman.erb:        interface = <; <%= @ipaddress %> ; <%= @ipaddress6 %>
./modules/profile/manifests/dnsrecursor.pp:            $facts['ipaddress6'],
./modules/profile/manifests/dnsrecursor.pp:    ::dnsrecursor::monitor { [ $facts['ipaddress'], $facts['ipaddress6'] ]: }
./modules/profile/manifests/pybal.pp:        'bgp-nexthop-ipv6'    => inline_template("<%= require 'ipaddr'; (IPAddr.new(@ipaddress6).mask(64) | IPAddr.new(\"::\" + @ipaddress.gsub('.', ':'))).to_s() %>"),
./modules/profile/manifests/openstack/base/pdns/auth/service.pp:        dns_auth_ipaddress6    => $facts['ipaddress6'],
./modules/ssh/manifests/server.pp:    if $::ipaddress6 == undef {
./modules/ssh/manifests/server.pp:        $aliases = [ $::hostname, $::ipaddress, $::ipaddress6 ]
./modules/standard/spec/default_module_facts.yml:ipaddress6: 2001:db8::42
./modules/calico/templates/initscripts/calico-node.systemd.erb:  -e IP6=<%= @ipaddress6 %> \
./modules/pdns_server/templates/pdns.conf.erb:<% if @dns_auth_ipaddress6 then %>local-ipv6=<%= @dns_auth_ipaddress6 %><% end %>
./modules/pdns_server/manifests/init.pp:# - $dns_auth_ipaddress6:IPv6 address PowerDNS will bind to and send packets from
./modules/pdns_server/manifests/init.pp:    $dns_

Note: need to check these code points.
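
To illustrate the behavioral difference described above, here is a hypothetical sketch (not the actual interface_primary.rb fact) of custom-fact logic that restores the facter 2 semantics: link-local (fe80::/10) addresses are ignored, so the fact stays unset when a host has no global IPv6 address:

```python
import ipaddress

# Hypothetical helper mimicking the desired fact logic: return the first
# non-link-local IPv6 address, or None (facter 2 reported undef here,
# while facter 3 falls through to the fe80:: link-local address).
def pick_ipaddress6(addresses):
    for addr in addresses:
        ip = ipaddress.ip_address(addr)
        if ip.version == 6 and not ip.is_link_local:
            return addr
    return None

# Host with only a link-local address: fact should remain unset.
only_link_local = pick_ipaddress6(["fe80::66c3:d602:8bc:c7f1"])

# Host with a global address: the global one is preferred.
with_global = pick_ipaddress6(
    ["fe80::66c3:d602:8bc:c7f1", "2620:0:861:101:10:64:0:131"]
)
```

Code paths such as `modules/ssh/manifests/server.pp`, which branches on `$::ipaddress6 == undef`, would change behavior under facter 3 without a guard like this.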

Change 510171 abandoned by Muehlenhoff:
Switch puppetdb1001/1002 to facter 3/puppet 5

https://gerrit.wikimedia.org/r/510171

jbond moved this task from Unsorted 💣 to Active 🚁 on the User-jbond board.Wed, Oct 30, 4:54 PM
jbond added a comment.Wed, Oct 30, 5:29 PM

Just waiting for the puppetdbs to get upgraded.

jbond closed this task as Resolved.Thu, Nov 7, 5:28 PM

This is now complete.