Paste P7573


Authored by Paladox on Sep 20 2018, 2:21 PM.
root@puppet-paladox:/etc# puppet agent --enable && puppet agent -tv
Info: Using configured environment 'production'
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Caching catalog for puppet-paladox.git.eqiad.wmflabs
Notice: /Stage[main]/Base::Environment/Tidy[/var/tmp/core]: Tidying 0 files
Info: Applying configuration version '1537452631'
Notice: /Stage[main]/Ferm/Package[ferm]/ensure: created
Notice: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]/content:
--- /etc/ferm/ferm.conf 2017-06-06 10:40:08.000000000 +0000
+++ /tmp/puppet-file20180920-6657-1rpl7h6 2018-09-20 14:10:43.492000000 +0000
@@ -1,52 +1,3 @@
-# -*- shell-script -*-
-#
-# Configuration file for ferm(1).
-#
+@include 'functions.conf';
-table filter {
- chain INPUT {
- policy DROP;
-
- # connection tracking
- mod state state INVALID DROP;
- mod state state (ESTABLISHED RELATED) ACCEPT;
-
- # allow local packet
- interface lo ACCEPT;
-
- # respond to ping
- proto icmp ACCEPT;
-
- # allow IPsec
- proto udp dport 500 ACCEPT;
- proto (esp ah) ACCEPT;
-
- # allow SSH connections
- proto tcp dport ssh ACCEPT;
- }
- chain OUTPUT {
- policy ACCEPT;
-
- # connection tracking
- #mod state state INVALID DROP;
- mod state state (ESTABLISHED RELATED) ACCEPT;
- }
- chain FORWARD {
- policy DROP;
-
- # connection tracking
- mod state state INVALID DROP;
- mod state state (ESTABLISHED RELATED) ACCEPT;
- }
-}
-
-# IPv6:
-#domain ip6 {
-# table filter {
-# chain INPUT {
-# policy ACCEPT;
-# # ...
-# }
-# # ...
-# }
-#}
+@include 'conf.d/';
Info: Computing checksum on file /etc/ferm/ferm.conf
Info: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]: Filebucketed /etc/ferm/ferm.conf to puppet with sum 91410f27613e600a8892d2a7076d1bcf
Notice: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]/content: content changed '{md5}91410f27613e600a8892d2a7076d1bcf' to '{md5}4bea2934a124683725db912836697b1a'
Notice: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]/group: group changed 'adm' to 'root'
Notice: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]/mode: mode changed '0644' to '0400'
Info: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]: Scheduling refresh of Service[ferm]
Info: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]: Scheduling refresh of Service[ferm]
Info: /Stage[main]/Ferm/File[/etc/ferm/ferm.conf]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Ferm/File[/etc/ferm/functions.conf]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Ferm/File[/etc/ferm/functions.conf]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Role::Prometheus::Node_exporter/Ferm::Service[prometheus-node-exporter]/File[/etc/ferm/conf.d/10_prometheus-node-exporter]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Role::Prometheus::Node_exporter/Ferm::Service[prometheus-node-exporter]/File[/etc/ferm/conf.d/10_prometheus-node-exporter]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Ferm/File[/etc/default/ferm]/content:
--- /etc/default/ferm 2018-09-20 14:10:42.676000000 +0000
+++ /tmp/puppet-file20180920-6657-1i3sqlp 2018-09-20 14:10:43.816000000 +0000
@@ -4,10 +4,11 @@
FAST=yes
# cache the output of ferm --lines in /var/cache/ferm?
-CACHE=yes
+CACHE=no
-# additional paramaters for ferm (like --def '=bar')
+# additional paramaters for ferm (like --def '$foo=bar')
OPTIONS=
-# Enable the ferm init script? (i.e. run on bootup)
-ENABLED="yes"
+# Enable ferm on bootup?
+ENABLED=yes
+
Info: Computing checksum on file /etc/default/ferm
Info: /Stage[main]/Ferm/File[/etc/default/ferm]: Filebucketed /etc/default/ferm to puppet with sum a4daba7939f6be9a87f26f1a89324806
Notice: /Stage[main]/Ferm/File[/etc/default/ferm]/content: content changed '{md5}a4daba7939f6be9a87f26f1a89324806' to '{md5}3e9b11c20066c1658ab353e597ea8e5e'
Notice: /Stage[main]/Ferm/File[/etc/default/ferm]/mode: mode changed '0644' to '0400'
Info: /Stage[main]/Ferm/File[/etc/default/ferm]: Scheduling refresh of Service[ferm]
Info: /Stage[main]/Ferm/File[/etc/default/ferm]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Puppetmaster::Puppetdb::Database/File[/etc/postgresql/9.6/main/tuning.conf]/content:
--- /etc/postgresql/9.6/main/tuning.conf 2018-04-04 14:03:20.340198265 +0000
+++ /tmp/puppet-file20180920-6657-shehqr 2018-09-20 14:10:44.004000000 +0000
@@ -3,5 +3,5 @@
effective_cache_size = 8GB
work_mem = 192MB
wal_buffers = 8MB
-shared_buffers = 7680MB
+shared_buffers = 768MB
max_connections = 120
Info: Computing checksum on file /etc/postgresql/9.6/main/tuning.conf
Info: /Stage[main]/Puppetmaster::Puppetdb::Database/File[/etc/postgresql/9.6/main/tuning.conf]: Filebucketed /etc/postgresql/9.6/main/tuning.conf to puppet with sum 762ce54bd2c21376519a5ccb948c3413
Notice: /Stage[main]/Puppetmaster::Puppetdb::Database/File[/etc/postgresql/9.6/main/tuning.conf]/content: content changed '{md5}762ce54bd2c21376519a5ccb948c3413' to '{md5}f06e35de44133c70760df5680ef03a41'
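The tuning.conf change above shrinks `shared_buffers` tenfold, from 7680MB to 768MB. As a hedged aside (a common PostgreSQL rule of thumb, not anything stated in this paste): `shared_buffers` is often sized at roughly 25% of RAM, so 7680MB would correspond to a 30GB host, while 768MB fits a small ~3GB labs instance:

```python
# Rule-of-thumb sketch (assumption, not the actual WMF tuning template):
# size PostgreSQL's shared_buffers at ~25% of system RAM.
def shared_buffers_mb(ram_mb: int, fraction: float = 0.25) -> int:
    """Return a shared_buffers value in MB for a host with ram_mb of RAM."""
    return int(ram_mb * fraction)

# 30 GB host -> 7680 MB (the old value); 3 GB instance -> 768 MB (the new one).
old = shared_buffers_mb(30 * 1024)
new = shared_buffers_mb(3 * 1024)
```

If that heuristic is what generated both values, the diff is consistent with the template being re-evaluated against this VM's actual memory rather than a larger host's.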
Notice: /Stage[main]/Postgresql::Server/File[/etc/postgresql/9.6/main/postgresql.conf]/content:
--- /etc/postgresql/9.6/main/postgresql.conf 2018-04-04 14:42:59.496768703 +0000
+++ /tmp/puppet-file20180920-6657-1ldn9v3 2018-09-20 14:10:44.040000000 +0000
@@ -495,7 +495,7 @@
#------------------------------------------------------------------------------
#custom_variable_classes = '' # list of custom variable class names
-#nclude 'tuning.conf'
+include 'tuning.conf'
include 'master.conf'
# SSL configuration
include 'ssl.conf'
Info: Computing checksum on file /etc/postgresql/9.6/main/postgresql.conf
Info: /Stage[main]/Postgresql::Server/File[/etc/postgresql/9.6/main/postgresql.conf]: Filebucketed /etc/postgresql/9.6/main/postgresql.conf to puppet with sum af1ce7e3dc4face3c306f3d495cc6df1
Notice: /Stage[main]/Postgresql::Server/File[/etc/postgresql/9.6/main/postgresql.conf]/content: content changed '{md5}af1ce7e3dc4face3c306f3d495cc6df1' to '{md5}89a53f727ca31493bb052af9588431f7'
Notice: /Stage[main]/Puppetmaster::Scripts/File[/usr/local/bin/puppet-merge]/content:
--- /usr/local/bin/puppet-merge 2018-04-11 15:09:04.417652257 +0000
+++ /tmp/puppet-file20180920-6657-1iw3kc4 2018-09-20 14:10:44.304000000 +0000
@@ -192,3 +192,26 @@
# cause all remaining masters to be aborted and left out of sync.
set +e
+# Note: The "true" command is passed on purpose to show that the command passed
+# to the SSH sessions is irrelevant. It's the SSH forced command trick on the
+# worker end that does the actual work. Note that the $sha1 however is important
+
+if [ -z ${sha1} ]; then # Only loop through the other servers if called without sha1
+ if [ $running_user = $git_user ]; then
+ ssh -t -t true ${fetch_head_sha1} 2>&1
+ else
+ su - $git_user -c "ssh -t -t true ${fetch_head_sha1} 2>&1"
+ fi
+ if [ $? -eq 0 ]; then
+ echo "${GREEN}OK${RESET}: puppet-merge on succeeded"
+ else
+ echo "${RED}ERROR${RESET}: puppet-merge on failed"
+ fi
+ # avoid a syntax error if this list is empty
+ true
+fi
+# conftool-merge does need to run from >1 frontend, avoid running a second time
+if [ $running_user != $git_user ]; then
+ echo "Now running conftool-merge to sync any changes to conftool data"
+ /usr/local/bin/conftool-merge
+fi
Info: Computing checksum on file /usr/local/bin/puppet-merge
Info: /Stage[main]/Puppetmaster::Scripts/File[/usr/local/bin/puppet-merge]: Filebucketed /usr/local/bin/puppet-merge to puppet with sum a73c36ac693b7363a8ddf634c8eab877
Notice: /Stage[main]/Puppetmaster::Scripts/File[/usr/local/bin/puppet-merge]/content: content changed '{md5}a73c36ac693b7363a8ddf634c8eab877' to '{md5}37927711830373893e74f4986e019d9b'
Notice: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]/content:
--- /etc/ferm/conf.d/00_defs 2018-04-04 14:03:21.172199015 +0000
+++ /tmp/puppet-file20180920-6657-elyci3 2018-09-20 14:10:44.832000000 +0000
@@ -2,29 +2,29 @@
@def $INTERNAL = (10.0.0.0/8 2620:0:860:100::/56 2620:0:861:100::/56 2620:0:862:100::/56 2620:0:863:100::/56);
# $DOMAIN_NETWORKS is a set of all networks belonging to a domain.
# a domain is a realm currently, but the notion is more generic than that on purpose
-@def $DOMAIN_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 );
+@def $DOMAIN_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 172.16.0.0/21 172.16.128.0/21 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 );
# $PRODUCTION_NETWORKS is a set of all production networks
@def $PRODUCTION_NETWORKS = (10.128.0.0/24 10.132.0.0/24 10.192.0.0/22 10.192.16.0/22 10.192.20.0/24 10.192.21.0/24 10.192.32.0/22 10.192.48.0/22 10.192.64.0/21 10.192.72.0/24 10.2.1.0/24 10.2.2.0/24 10.2.3.0/24 10.2.4.0/24 10.2.5.0/24 10.20.0.0/24 10.64.0.0/22 10.64.16.0/22 10.64.20.0/24 10.64.21.0/24 10.64.32.0/22 10.64.36.0/24 10.64.37.0/24 10.64.4.0/24 10.64.48.0/22 10.64.5.0/24 10.64.52.0/24 10.64.53.0/24 10.64.64.0/21 10.64.72.0/24 10.64.75.0/24 10.64.76.0/24 103.102.166.0/28 103.102.166.224/27 198.35.26.0/28 198.35.26.96/27 2001:df2:e500:101::/64 2001:df2:e500:1::/64 2001:df2:e500:ed1a::/64 208.80.153.0/27 208.80.153.224/27 208.80.153.32/27 208.80.153.64/27 208.80.153.96/27 208.80.154.0/26 208.80.154.128/26 208.80.154.224/27 208.80.154.64/26 208.80.155.96/27 2620:0:860:101::/64 2620:0:860:102::/64 2620:0:860:103::/64 2620:0:860:104::/64 2620:0:860:118::/64 2620:0:860:122::/64 2620:0:860:1::/64 2620:0:860:2::/64 2620:0:860:3::/64 2620:0:860:4::/64 2620:0:860:cabe::/64 2620:0:860:ed1a::/64 2620:0:861:101::/64 2620:0:861:102::/64 2620:0:861:103::/64 2620:0:861:104::/64 2620:0:861:105::/64 2620:0:861:106::/64 2620:0:861:107::/64 2620:0:861:108::/64 2620:0:861:117::/64 2620:0:861:118::/64 2620:0:861:119::/64 2620:0:861:1::/64 2620:0:861:2::/64 2620:0:861:3::/64 2620:0:861:4::/64 2620:0:861:cabe::/64 2620:0:861:ed1a::/64 2620:0:862:102::/64 2620:0:862:1::/64 2620:0:862:ed1a::/64 2620:0:863:101::/64 2620:0:863:1::/64 2620:0:863:ed1a::/64 91.198.174.0/25 91.198.174.192/27 );
# $LABS_NETWORKS is meant to be a set of all labs networks
-@def $LABS_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 );
+@def $LABS_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 172.16.0.0/21 172.16.128.0/21 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 );
# $FRACK_NETWORKS is meant to be a set of all fundraising networks
@def $FRACK_NETWORKS = (10.195.0.0/27 10.195.0.32/27 10.195.0.64/29 10.195.0.72/29 10.195.0.80/29 10.195.0.96/27 10.64.40.0/27 10.64.40.128/27 10.64.40.160/27 10.64.40.32/27 10.64.40.64/27 10.64.40.96/27 208.80.152.224/28 208.80.155.0/27 );
# Temporarily include flevorium (10.64.48.112/32) and furud (10.192.16.65/32)
# in ANALYTICS_NETWORKS for backup purposes, see: T176506
@def $ANALYTICS_NETWORKS = (10.64.21.0/24 10.64.36.0/24 10.64.5.0/24 10.64.53.0/24 2620:0:861:104::/64 2620:0:861:105::/64 2620:0:861:106::/64 2620:0:861:108::/64 10.64.48.112/32 10.192.16.65/32 );
-@def $MW_APPSERVER_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 127.0.0.1 );
+@def $MW_APPSERVER_NETWORKS = (10.196.0.0/24 10.196.16.0/21 10.196.32.0/24 10.196.48.0/24 10.68.0.0/24 10.68.16.0/21 10.68.32.0/24 10.68.48.0/24 172.16.0.0/21 172.16.128.0/21 2620:0:860:201::/64 2620:0:860:202::/64 2620:0:860:203::/64 2620:0:860:204::/64 2620:0:861:201::/64 2620:0:861:202::/64 2620:0:861:203::/64 2620:0:861:204::/64 127.0.0.1 );
@def $NETWORK_INFRA = (91.198.174.224/27 2620:0:862:fe00::/55 198.35.26.192/27 2620:0:863:fe00::/55 208.80.153.192/27 2620:0:860:fe00::/55 208.80.154.192/27 2620:0:861:fe00::/55 103.102.166.128/27 2001:df2:e500:fe00::/56 );
@def $MGMT_NETWORKS = (10.65.0.0/16 10.128.128.0/17 10.193.0.0/16 10.21.0.0/24 10.132.128.0/17 );
-@def $BASTION_HOSTS = (10.68.17.232 10.68.18.65 10.68.18.66 10.68.18.68 );
-@def $CACHE_MISC = (10.68.21.68 );
-@def $CUMIN_MASTERS = (10.68.18.66 10.68.18.68 );
+@def $BASTION_HOSTS = (10.68.17.232 10.68.18.65 10.68.18.66 10.68.18.68 172.16.1.136 172.16.1.210 172.16.1.211 172.16.1.135 );
+@def $CACHES = (10.68.21.68 );
+@def $CUMIN_MASTERS = (10.68.18.66 10.68.18.68 172.16.1.211 172.16.1.135 );
@def $CUMIN_REAL_MASTERS = (208.80.154.158 2620:0:861:2:208:80:154:158 208.80.155.120 2620:0:861:4:208:80:155:120 );
-@def $DEPLOYMENT_HOSTS = (10.68.21.205 10.68.20.135 );
+@def $DEPLOYMENT_HOSTS = (10.68.23.38 10.68.23.98 );
@def $MAINTENANCE_HOSTS = ( );
@def $MONITORING_HOSTS = (10.68.16.210 );
@@ -84,6 +84,10 @@
@def $EQIAD_PUBLIC_FRACK_EXTERNAL1_C_EQIAD_IPV4 = (208.80.155.0/27);
@def $EQIAD_PUBLIC_FRACK_EXTERNAL1_C_EQIAD = ($EQIAD_PUBLIC_FRACK_EXTERNAL1_C_EQIAD_IPV4 );
+# Realm: labs, # Site: codfw, # Sphere: private, # Network: cloud-instances2-b-codfw
+@def $CODFW_PRIVATE_CLOUD_INSTANCES2_B_CODFW_IPV4 = (172.16.128.0/21);
+@def $CODFW_PRIVATE_CLOUD_INSTANCES2_B_CODFW = ($CODFW_PRIVATE_CLOUD_INSTANCES2_B_CODFW_IPV4 );
+
# Realm: labs, # Site: codfw, # Sphere: private, # Network: labs-instances1-a-codfw
@def $CODFW_PRIVATE_LABS_INSTANCES1_A_CODFW_IPV4 = (10.196.0.0/24);
@def $CODFW_PRIVATE_LABS_INSTANCES1_A_CODFW_IPV6 = (2620:0:860:201::/64);
@@ -104,6 +108,10 @@
@def $CODFW_PRIVATE_LABS_INSTANCES1_D_CODFW_IPV6 = (2620:0:860:204::/64);
@def $CODFW_PRIVATE_LABS_INSTANCES1_D_CODFW = ($CODFW_PRIVATE_LABS_INSTANCES1_D_CODFW_IPV4 $CODFW_PRIVATE_LABS_INSTANCES1_D_CODFW_IPV6 );
+# Realm: labs, # Site: eqiad, # Sphere: private, # Network: cloud-instances2-b-eqiad
+@def $EQIAD_PRIVATE_CLOUD_INSTANCES2_B_EQIAD_IPV4 = (172.16.0.0/21);
+@def $EQIAD_PRIVATE_CLOUD_INSTANCES2_B_EQIAD = ($EQIAD_PRIVATE_CLOUD_INSTANCES2_B_EQIAD_IPV4 );
+
# Realm: labs, # Site: eqiad, # Sphere: private, # Network: labs-instances1-a-eqiad
@def $EQIAD_PRIVATE_LABS_INSTANCES1_A_EQIAD_IPV4 = (10.68.0.0/24);
@def $EQIAD_PRIVATE_LABS_INSTANCES1_A_EQIAD_IPV6 = (2620:0:861:201::/64);
Info: Computing checksum on file /etc/ferm/conf.d/00_defs
Info: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]: Filebucketed /etc/ferm/conf.d/00_defs to puppet with sum 2a884f210d6e57000ddfab0c6c574db3
Notice: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]/content: content changed '{md5}2a884f210d6e57000ddfab0c6c574db3' to '{md5}8aa2bb9b505d0d5d359ea8289e6d89f8'
Notice: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]: Scheduling refresh of Service[ferm]
Info: /Stage[main]/Base::Firewall/Ferm::Conf[defs]/File[/etc/ferm/conf.d/00_defs]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Base::Firewall/Ferm::Conf[main]/File[/etc/ferm/conf.d/00_main]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Base::Firewall/Ferm::Conf[main]/File[/etc/ferm/conf.d/00_main]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Base::Firewall/Ferm::Rule[bastion-ssh]/File[/etc/ferm/conf.d/10_bastion-ssh]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Base::Firewall/Ferm::Rule[bastion-ssh]/File[/etc/ferm/conf.d/10_bastion-ssh]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Base::Firewall/Ferm::Rule[monitoring-all]/File[/etc/ferm/conf.d/10_monitoring-all]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Base::Firewall/Ferm::Rule[monitoring-all]/File[/etc/ferm/conf.d/10_monitoring-all]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Base::Firewall/Ferm::Service[ssh-from-cumin-masters]/File[/etc/ferm/conf.d/10_ssh-from-cumin-masters]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Base::Firewall/Ferm::Service[ssh-from-cumin-masters]/File[/etc/ferm/conf.d/10_ssh-from-cumin-masters]: Scheduling refresh of Service[ferm]
Notice: Augeas[hba_create-puppetdb@localhost](provider=augeas):
--- /etc/postgresql/9.6/main/pg_hba.conf 2018-02-27 16:23:33.626533545 +0000
+++ /etc/postgresql/9.6/main/pg_hba.conf.augnew 2018-09-20 14:10:45.984000000 +0000
@@ -99,3 +99,4 @@
#host replication postgres ::1/128 md5
host puppetdb puppetdb 10.68.20.181/32 md5
local postgres prometheus peer
+host puppetdb puppetdb 172.16.1.179/32 md5
Notice: /Stage[main]/Puppetmaster::Puppetdb::Database/Postgresql::User[puppetdb@localhost]/Augeas[hba_create-puppetdb@localhost]/returns: executed successfully
Info: /Stage[main]/Puppetmaster::Puppetdb::Database/Postgresql::User[puppetdb@localhost]/Augeas[hba_create-puppetdb@localhost]: Scheduling refresh of Exec[pgreload]
Notice: /Stage[main]/Postgresql::Server/Exec[pgreload]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Prometheus::Postgres_exporter/Base::Service_auto_restart[prometheus-postgres-exporter]/Cron[wmf_auto_restart_prometheus-postgres-exporter]/ensure: created
Notice: /Stage[main]/Profile::Puppetdb::Database/Ferm::Service[postgresql_puppetdb]/File[/etc/ferm/conf.d/10_postgresql_puppetdb]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Profile::Puppetdb::Database/Ferm::Service[postgresql_puppetdb]/File[/etc/ferm/conf.d/10_postgresql_puppetdb]: Scheduling refresh of Service[ferm]
Error: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install nginx-full' returned 100: Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
libnginx-mod-http-auth-pam libnginx-mod-http-dav-ext libnginx-mod-http-echo
libnginx-mod-http-geoip libnginx-mod-http-image-filter
libnginx-mod-http-subs-filter libnginx-mod-http-upstream-fair
libnginx-mod-http-xslt-filter libnginx-mod-mail libnginx-mod-stream
nginx-common
Suggested packages:
fcgiwrap nginx-doc
The following NEW packages will be installed:
libnginx-mod-http-auth-pam libnginx-mod-http-dav-ext libnginx-mod-http-echo
libnginx-mod-http-geoip libnginx-mod-http-image-filter
libnginx-mod-http-subs-filter libnginx-mod-http-upstream-fair
libnginx-mod-http-xslt-filter libnginx-mod-mail libnginx-mod-stream
nginx-common nginx-full
0 upgraded, 12 newly installed, 0 to remove and 0 not upgraded.
Need to get 1615 kB of archives.
After this operation, 3056 kB of additional disk space will be used.
Get:1 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 nginx-common all 1.13.6-2+wmf1 [120 kB]
Get:2 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-auth-pam amd64 1.13.6-2+wmf1 [90.5 kB]
Get:3 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-dav-ext amd64 1.13.6-2+wmf1 [92.3 kB]
Get:4 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-echo amd64 1.13.6-2+wmf1 [102 kB]
Get:5 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-geoip amd64 1.13.6-2+wmf1 [91.7 kB]
Get:6 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-image-filter amd64 1.13.6-2+wmf1 [95.0 kB]
Get:7 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-subs-filter amd64 1.13.6-2+wmf1 [93.6 kB]
Get:8 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-upstream-fair amd64 1.13.6-2+wmf1 [93.7 kB]
Get:9 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-http-xslt-filter amd64 1.13.6-2+wmf1 [93.5 kB]
Get:10 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-mail amd64 1.13.6-2+wmf1 [122 kB]
Get:11 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 libnginx-mod-stream amd64 1.13.6-2+wmf1 [144 kB]
Get:12 http://apt.wikimedia.org/wikimedia stretch-wikimedia/main amd64 nginx-full amd64 1.13.6-2+wmf1 [477 kB]
Preconfiguring packages ...
Fetched 1615 kB in 0s (27.2 MB/s)
Selecting previously unselected package nginx-common.
(Reading database ... 67260 files and directories currently installed.)
Preparing to unpack .../00-nginx-common_1.13.6-2+wmf1_all.deb ...
Unpacking nginx-common (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-auth-pam.
Preparing to unpack .../01-libnginx-mod-http-auth-pam_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-auth-pam (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-dav-ext.
Preparing to unpack .../02-libnginx-mod-http-dav-ext_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-dav-ext (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-echo.
Preparing to unpack .../03-libnginx-mod-http-echo_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-echo (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-geoip.
Preparing to unpack .../04-libnginx-mod-http-geoip_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-geoip (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-image-filter.
Preparing to unpack .../05-libnginx-mod-http-image-filter_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-image-filter (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-subs-filter.
Preparing to unpack .../06-libnginx-mod-http-subs-filter_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-subs-filter (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-upstream-fair.
Preparing to unpack .../07-libnginx-mod-http-upstream-fair_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-upstream-fair (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-http-xslt-filter.
Preparing to unpack .../08-libnginx-mod-http-xslt-filter_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-http-xslt-filter (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-mail.
Preparing to unpack .../09-libnginx-mod-mail_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-mail (1.13.6-2+wmf1) ...
Selecting previously unselected package libnginx-mod-stream.
Preparing to unpack .../10-libnginx-mod-stream_1.13.6-2+wmf1_amd64.deb ...
Unpacking libnginx-mod-stream (1.13.6-2+wmf1) ...
Selecting previously unselected package nginx-full.
Preparing to unpack .../11-nginx-full_1.13.6-2+wmf1_amd64.deb ...
Unpacking nginx-full (1.13.6-2+wmf1) ...
Setting up nginx-common (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-image-filter (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-subs-filter (1.13.6-2+wmf1) ...
Processing triggers for systemd (232-25+deb9u4) ...
Setting up libnginx-mod-http-auth-pam (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-dav-ext (1.13.6-2+wmf1) ...
Setting up libnginx-mod-mail (1.13.6-2+wmf1) ...
Processing triggers for man-db (2.7.6.1-2) ...
Setting up libnginx-mod-http-xslt-filter (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-upstream-fair (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-geoip (1.13.6-2+wmf1) ...
Setting up libnginx-mod-stream (1.13.6-2+wmf1) ...
Setting up libnginx-mod-http-echo (1.13.6-2+wmf1) ...
Setting up nginx-full (1.13.6-2+wmf1) ...
Job for nginx.service failed because the control process exited with error code.
See "systemctl status nginx.service" and "journalctl -xe" for details.
invoke-rc.d: initscript nginx, action "start" failed.
* nginx.service - A high performance web server and a reverse proxy server
Loaded: loaded (/lib/systemd/system/nginx.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Thu 2018-09-20 14:10:51 UTC; 19ms ago
Docs: man:nginx(8)
Process: 8415 ExecStart=/usr/sbin/nginx -g daemon on; master_process on; (code=exited, status=1/FAILURE)
Process: 8413 ExecStartPre=/usr/sbin/nginx -t -q -g daemon on; master_process on; (code=exited, status=0/SUCCESS)
Sep 20 14:10:50 puppet-paladox nginx[8415]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:50 puppet-paladox nginx[8415]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:50 puppet-paladox nginx[8415]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:51 puppet-paladox nginx[8415]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:51 puppet-paladox nginx[8415]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:52 puppet-paladox nginx[8415]: nginx: [emerg] still could not b…d()
Sep 20 14:10:51 puppet-paladox systemd[1]: nginx.service: Control process ex…s=1
Sep 20 14:10:51 puppet-paladox systemd[1]: Failed to start A high performanc…er.
Sep 20 14:10:51 puppet-paladox systemd[1]: nginx.service: Unit entered faile…te.
Sep 20 14:10:51 puppet-paladox systemd[1]: nginx.service: Failed with result…e'.
Hint: Some lines were ellipsized, use -l to show in full.
dpkg: error processing package nginx-full (--configure):
subprocess installed post-installation script returned error exit status 1
Errors were encountered while processing:
nginx-full
E: Sub-process /usr/bin/dpkg returned an error code (1)
Error: /Stage[main]/Nginx/Package[nginx-full]: Could not evaluate: Puppet::Util::Log requires a message
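The ellipsized `bind()` emerg lines above most likely read "Address already in use" (EADDRINUSE): something was already listening on port 80, so nginx's master process exited with status 1 and dpkg left nginx-full half-configured. A minimal, self-contained sketch of that failure mode (illustrative only, not taken from this host):

```python
import errno
import socket

# Two sockets binding the same address:port without SO_REUSEADDR: the second
# bind() raises OSError(EADDRINUSE), the same condition nginx hit on :80.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))        # port 0: the kernel picks a free port
listener.listen()
port = listener.getsockname()[1]

second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))   # conflicts with the listener above
    bind_failed = False
except OSError as exc:
    bind_failed = exc.errno == errno.EADDRINUSE
finally:
    second.close()
    listener.close()
```

On the real host the usual remedy would be to find and stop whatever holds port 80 (e.g. `ss -tlnp 'sport = :80'`) and then re-run `dpkg --configure -a` to finish configuring the package.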
Error: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install nginx-common' returned 100: Reading package lists...
Building dependency tree...
Reading state information...
nginx-common is already the newest version (1.13.6-2+wmf1).
nginx-common set to manually installed.
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
1 not fully installed or removed.
After this operation, 0 B of additional disk space will be used.
Setting up nginx-full (1.13.6-2+wmf1) ...
Job for nginx.service failed because the control process exited with error code.
See "systemctl status nginx.service" and "journalctl -xe" for details.
invoke-rc.d: initscript nginx, action "start" failed.
* nginx.service - A high performance web server and a reverse proxy server
Loaded: loaded (/lib/systemd/system/nginx.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Thu 2018-09-20 14:10:55 UTC; 17ms ago
Docs: man:nginx(8)
Process: 8465 ExecStart=/usr/sbin/nginx -g daemon on; master_process on; (code=exited, status=1/FAILURE)
Process: 8462 ExecStartPre=/usr/sbin/nginx -t -q -g daemon on; master_process on; (code=exited, status=0/SUCCESS)
Sep 20 14:10:53 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:55 puppet-paladox nginx[8465]: nginx: [emerg] still could not b…d()
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Control process ex…s=1
Sep 20 14:10:55 puppet-paladox systemd[1]: Failed to start A high performanc…er.
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Unit entered faile…te.
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Failed with result…e'.
Hint: Some lines were ellipsized, use -l to show in full.
dpkg: error processing package nginx-full (--configure):
subprocess installed post-installation script returned error exit status 1
Errors were encountered while processing:
nginx-full
E: Sub-process /usr/bin/dpkg returned an error code (1)
Error: /Stage[main]/Nginx/Package[nginx-common]/ensure: change from absent to present failed: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install nginx-common' returned 100: Reading package lists...
Building dependency tree...
Reading state information...
nginx-common is already the newest version (1.13.6-2+wmf1).
nginx-common set to manually installed.
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
1 not fully installed or removed.
After this operation, 0 B of additional disk space will be used.
Setting up nginx-full (1.13.6-2+wmf1) ...
Job for nginx.service failed because the control process exited with error code.
See "systemctl status nginx.service" and "journalctl -xe" for details.
invoke-rc.d: initscript nginx, action "start" failed.
* nginx.service - A high performance web server and a reverse proxy server
Loaded: loaded (/lib/systemd/system/nginx.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Thu 2018-09-20 14:10:55 UTC; 17ms ago
Docs: man:nginx(8)
Process: 8465 ExecStart=/usr/sbin/nginx -g daemon on; master_process on; (code=exited, status=1/FAILURE)
Process: 8462 ExecStartPre=/usr/sbin/nginx -t -q -g daemon on; master_process on; (code=exited, status=0/SUCCESS)
Sep 20 14:10:53 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to 0.0.0.0…se)
Sep 20 14:10:54 puppet-paladox nginx[8465]: nginx: [emerg] bind() to [::]:80…se)
Sep 20 14:10:55 puppet-paladox nginx[8465]: nginx: [emerg] still could not b…d()
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Control process ex…s=1
Sep 20 14:10:55 puppet-paladox systemd[1]: Failed to start A high performanc…er.
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Unit entered faile…te.
Sep 20 14:10:55 puppet-paladox systemd[1]: nginx.service: Failed with result…e'.
Hint: Some lines were ellipsized, use -l to show in full.
dpkg: error processing package nginx-full (--configure):
subprocess installed post-installation script returned error exit status 1
Errors were encountered while processing:
nginx-full
E: Sub-process /usr/bin/dpkg returned an error code (1)
Notice: /Stage[main]/Nginx/File[/etc/nginx/conf.d]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/File[/etc/nginx/conf.d]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/File[/etc/nginx/sites-available]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/File[/etc/nginx/sites-available]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/File[/etc/nginx/sites-available/default]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/File[/etc/nginx/sites-available/default]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/File[/etc/nginx/sites-enabled]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/File[/etc/nginx/sites-enabled]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/File[/etc/nginx/sites-enabled/default]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/File[/etc/nginx/sites-enabled/default]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Nginx::Site[puppetdb]/File[/etc/nginx/sites-available/puppetdb]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Nginx::Site[puppetdb]/File[/etc/nginx/sites-available/puppetdb]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Nginx::Site[puppetdb]/File[/etc/nginx/sites-enabled/puppetdb]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Nginx::Site[puppetdb]/File[/etc/nginx/sites-enabled/puppetdb]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/File[/etc/puppetdb]: Not removing directory; use 'force' to override
Notice: /Stage[main]/Puppetdb::App/File[/etc/puppetdb]: Not removing directory; use 'force' to override
Error: Could not remove existing file
Error: /Stage[main]/Puppetdb::App/File[/etc/puppetdb]/ensure: change from directory to link failed: Could not remove existing file
Notice: /Stage[main]/Puppetdb::App/File[/var/lib/puppetdb]/ensure: created
Notice: /Stage[main]/Puppetdb::App/File[/etc/default/puppetdb]/content:
--- /etc/default/puppetdb 2018-04-04 16:28:30.423045399 +0000
+++ /tmp/puppet-file20180920-6657-zljq99 2018-09-20 14:10:55.780000000 +0000
@@ -6,7 +6,7 @@
JAVA_BIN="/usr/bin/java"
# Modify this if you'd like to change the memory allocation, enable JMX, etc
-JAVA_ARGS="-Xmx192m"
+JAVA_ARGS="-Xmx4G -javaagent:/usr/share/java/prometheus/jmx_prometheus_javaagent.jar=172.16.1.179:9400:/etc/puppetdb/jvm_prometheus_puppetdb_jmx_exporter.yaml"
# These normally shouldn't need to be edited if using OS packages
USER="puppetdb"
Info: Computing checksum on file /etc/default/puppetdb
Info: /Stage[main]/Puppetdb::App/File[/etc/default/puppetdb]: Filebucketed /etc/default/puppetdb to puppet with sum 57cc3ba59c854fedac865a1600cacb47
Notice: /Stage[main]/Puppetdb::App/File[/etc/default/puppetdb]/content: content changed '{md5}57cc3ba59c854fedac865a1600cacb47' to '{md5}784c1ac03b669deb2d8c23d0cf9712cb'
Notice: /Stage[main]/Puppetdb::App/File[/etc/puppetdb/conf.d]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/File[/etc/puppetdb/conf.d]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/File[/etc/puppetdb/conf.d/config.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/File[/etc/puppetdb/conf.d/config.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[database]/File[/etc/puppetdb/conf.d/database.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[database]/File[/etc/puppetdb/conf.d/database.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[read-database]/File[/etc/puppetdb/conf.d/read-database.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[read-database]/File[/etc/puppetdb/conf.d/read-database.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[global]/File[/etc/puppetdb/conf.d/global.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[global]/File[/etc/puppetdb/conf.d/global.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[repl]/File[/etc/puppetdb/conf.d/repl.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[repl]/File[/etc/puppetdb/conf.d/repl.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/cert.pem]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/cert.pem]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/server.key]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/server.key]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/server-keypair.pem]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Base::Expose_puppet_certs[/etc/puppetdb]/File[/etc/puppetdb/ssl/server-keypair.pem]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[jetty]/File[/etc/puppetdb/conf.d/jetty.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[jetty]/File[/etc/puppetdb/conf.d/jetty.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Puppetdb::Config[command-processing]/File[/etc/puppetdb/conf.d/command-processing.ini]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Puppetdb::Config[command-processing]/File[/etc/puppetdb/conf.d/command-processing.ini]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetdb::App/Service[puppetdb]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Puppetdb::App/Service[puppetdb]: Skipping because of failed dependencies
Info: Class[Puppetdb::App]: Unscheduling all events on Class[Puppetdb::App]
Notice: /Stage[main]/Profile::Puppetdb/Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]/File[/etc/puppetdb/jvm_prometheus_puppetdb_jmx_exporter.yaml]: Dependency File[/etc/puppetdb] has failures: true
Warning: /Stage[main]/Profile::Puppetdb/Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]/File[/etc/puppetdb/jvm_prometheus_puppetdb_jmx_exporter.yaml]: Skipping because of failed dependencies
Notice: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]/content:
--- /etc/ferm/conf.d/10_puppetdb 2018-02-27 15:26:42.102609423 +0000
+++ /tmp/puppet-file20180920-6657-1wv6obs 2018-09-20 14:10:55.840000000 +0000
@@ -1,7 +1,7 @@
# Autogenerated by puppet. DO NOT EDIT BY HAND!
#
#
-&R_SERVICE(tcp, 443, @resolve((labs-puppetmaster.wikimedia.org)));
+&R_SERVICE(tcp, 443, @resolve(()));
&NO_TRACK(tcp, 443);
Info: Computing checksum on file /etc/ferm/conf.d/10_puppetdb
Info: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]: Filebucketed /etc/ferm/conf.d/10_puppetdb to puppet with sum f39b151c3d8f07a87f2d4ec9309fea69
Notice: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]/content: content changed '{md5}f39b151c3d8f07a87f2d4ec9309fea69' to '{md5}08989f2b5ae751ddd6e9dfd47554e96e'
Notice: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]: Scheduling refresh of Service[ferm]
Info: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb]/File[/etc/ferm/conf.d/10_puppetdb]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb-cumin]/File[/etc/ferm/conf.d/10_puppetdb-cumin]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Profile::Puppetdb/Ferm::Service[puppetdb-cumin]/File[/etc/ferm/conf.d/10_puppetdb-cumin]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Role::Puppetmaster::Standalone/Ferm::Service[puppetmaster-standalone]/File[/etc/ferm/conf.d/10_puppetmaster-standalone]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Role::Puppetmaster::Standalone/Ferm::Service[puppetmaster-standalone]/File[/etc/ferm/conf.d/10_puppetmaster-standalone]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Base::Firewall/Sysctl::Parameters[ferm_conntrack]/Sysctl::Conffile[ferm_conntrack]/File[/etc/sysctl.d/70-ferm_conntrack.conf]/ensure: defined content as '{md5}117ec150d2baac324d1e701edb170d3e'
Info: /Stage[main]/Base::Firewall/Sysctl::Parameters[ferm_conntrack]/Sysctl::Conffile[ferm_conntrack]/File[/etc/sysctl.d/70-ferm_conntrack.conf]: Scheduling refresh of Exec[update_sysctl]
Notice: /Stage[main]/Puppetmaster::Puppetdb::Database/Sysctl::Parameters[postgres_shmem]/Sysctl::Conffile[postgres_shmem]/File[/etc/sysctl.d/70-postgres_shmem.conf]/ensure: defined content as '{md5}47748dbf1c3959ac45753f6ed573fc9f'
Info: /Stage[main]/Puppetmaster::Puppetdb::Database/Sysctl::Parameters[postgres_shmem]/Sysctl::Conffile[postgres_shmem]/File[/etc/sysctl.d/70-postgres_shmem.conf]: Scheduling refresh of Exec[update_sysctl]
Notice: /Stage[main]/Sysctl/Exec[update_sysctl]: Triggered 'refresh' from 2 events
Notice: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Diamond::Collector[Nginx]/File[/etc/diamond/collectors/NginxCollector.conf]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Diamond::Collector[Nginx]/File[/etc/diamond/collectors/NginxCollector.conf]: Skipping because of failed dependencies
Notice: /Stage[main]/Profile::Puppetdb/Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]/Ferm::Service[puppetdb_puppet-paladox_jmx_exporter]/File[/etc/ferm/conf.d/10_puppetdb_puppet-paladox_jmx_exporter]/group: group changed 'adm' to 'root'
Info: /Stage[main]/Profile::Puppetdb/Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]/Ferm::Service[puppetdb_puppet-paladox_jmx_exporter]/File[/etc/ferm/conf.d/10_puppetdb_puppet-paladox_jmx_exporter]: Scheduling refresh of Service[ferm]
Notice: /Stage[main]/Ferm/Service[ferm]: Triggered 'refresh' from 19 events
Info: Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]: Unscheduling all events on Profile::Prometheus::Jmx_exporter[puppetdb_puppet-paladox]
Info: Class[Profile::Puppetdb]: Unscheduling all events on Class[Profile::Puppetdb]
Notice: /Stage[main]/Role::Puppetmaster::Puppetdb/System::Role[puppetmaster::puppetdb (postgres master)]/Motd::Script[role-puppetmaster::puppetdb (postgres master)]/File[/etc/update-motd.d/05-role-puppetmaster--puppetdb--postgres-master-]/ensure: defined content as '{md5}78e1e7222eef081b954e7c6366dfa6a3'
Notice: /Stage[main]/Diamond/Service[diamond]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Diamond/Service[diamond]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Nginx::Status_site[status]/Nginx::Site[status]/File[/etc/nginx/sites-available/status]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Nginx::Status_site[status]/Nginx::Site[status]/File[/etc/nginx/sites-available/status]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/Exec[nginx-reload]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/Exec[nginx-reload]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Nginx::Status_site[status]/Nginx::Site[status]/File[/etc/nginx/sites-enabled/status]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Diamond::Collector::Nginx[puppet-paladox.git.eqiad.wmflabs]/Nginx::Status_site[status]/Nginx::Site[status]/File[/etc/nginx/sites-enabled/status]: Skipping because of failed dependencies
Notice: /Stage[main]/Nginx/Service[nginx]: Dependency Package[nginx-full] has failures: true
Warning: /Stage[main]/Nginx/Service[nginx]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl]: Dependency Package[nginx-full] has failures: true
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl]: Dependency Package[nginx-common] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/cert.pem]: Dependency Package[nginx-full] has failures: true
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/cert.pem]: Dependency Package[nginx-common] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/cert.pem]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server.key]: Dependency Package[nginx-full] has failures: true
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server.key]: Dependency Package[nginx-common] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server.key]: Skipping because of failed dependencies
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server-keypair.pem]: Dependency Package[nginx-full] has failures: true
Notice: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server-keypair.pem]: Dependency Package[nginx-common] has failures: true
Warning: /Stage[main]/Puppetmaster::Puppetdb/Base::Expose_puppet_certs[/etc/nginx]/File[/etc/nginx/ssl/server-keypair.pem]: Skipping because of failed dependencies
Info: Stage[main]: Unscheduling all events on Stage[main]
Notice: Applied catalog in 20.58 seconds
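
One suspicious change in the run above is the ferm rule in `/etc/ferm/conf.d/10_puppetdb` degrading from `@resolve((labs-puppetmaster.wikimedia.org))` to `@resolve(())`, which suggests the puppetmaster hostname variable was empty when the template rendered, producing a rule that matches nothing. A minimal sanity check for that pattern could look like the sketch below (a hypothetical helper, not part of the existing puppetization; the file path and rule syntax are taken from the log):

```shell
# check_ferm_resolve: warn about ferm config fragments whose @resolve()
# argument is empty, as happened to 10_puppetdb in the run above.
# Returns 1 if any offending file is found, 0 otherwise.
check_ferm_resolve() {
    dir="${1:-/etc/ferm/conf.d}"
    status=0
    for f in "$dir"/*; do
        [ -f "$f" ] || continue
        # An empty argument list renders as '@resolve(())'.
        if grep -q '@resolve(())' "$f"; then
            echo "WARNING: empty @resolve() argument in $f"
            status=1
        fi
    done
    return $status
}
```

Run against `/etc/ferm/conf.d` (or hooked in before a ferm reload), this would have flagged the fragment before the `Service[ferm]` refresh picked it up.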