
tools-nfs outage 2024-11-25
Closed, ResolvedPublic

Description

Just now I noticed a non-paging alert about a puppet failure on tools-sgebastion-10. While investigating I discovered that NFS was frozen for that host; I checked tools-bastion-13 and it too could not access project or user NFS.

$ systemctl restart nfs-server

on tools-nfs-2.tools.eqiad1.wikimedia.cloud seems to have resolved the issue, but I don't have a theory about the cause.
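
For the record, a quick way to confirm the server is actually serving again after a restart (sketch only; the mount path is the usual Toolforge project share and may differ):

$ rpcinfo -t tools-nfs.svc.tools.eqiad1.wikimedia.cloud nfs
$ showmount -e tools-nfs.svc.tools.eqiad1.wikimedia.cloud
$ timeout 10 ls /data/project > /dev/null && echo "NFS responding"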

I first logged in to investigate at 04:44:16 but I see @Anomie on IRC complaining about not being able to log in to sgebastion-10 starting earlier, around 04:00.

So we have two mysteries: why NFS stopped responding, and why this didn't alert.
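
On the second mystery: a minimal sketch of the kind of client-side probe that would catch a hung mount and page (hypothetical check, not something currently deployed; the path is illustrative):

$ timeout -k 5 30 stat /data/project > /dev/null 2>&1 || echo "CRITICAL: NFS stat timed out"

Since a hard NFS mount can leave processes stuck in uninterruptible sleep, such a probe needs to be a fresh short-lived process on every run rather than a long-lived daemon thread.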

Related Objects

Status        Assigned
Open          None
Resolved      aborrero
Open          None
Resolved      None
Open          None
Resolved      dcaro
Resolved      None
Resolved      dcaro
Resolved      dcaro
Resolved      Raymond_Ndibe
Resolved      Raymond_Ndibe
Resolved      Raymond_Ndibe
Resolved      Raymond_Ndibe
Resolved      Raymond_Ndibe
Resolved      Raymond_Ndibe
Resolved      dcaro
Resolved      dcaro
In Progress   dcaro
In Progress   dcaro

Event Timeline

Restricted Application added a subscriber: Aklapper.

I'm rebooting the NFS nodes via the cookbook. Multiple people are seeing intermittent DNS errors; I'm not sure how the two can be related, but this seems like a good first step.
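
For anyone chasing the DNS angle, a quick sanity check from an affected VM (sketch; the NFS service name is the one the mounts use, so resolution failures would at least be adjacent):

$ host tools-nfs.svc.tools.eqiad1.wikimedia.cloud
$ dig +short tools-nfs.svc.tools.eqiad1.wikimedia.cloud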

aborrero subscribed.

Regarding why NFS stopped responding, I did some quick research.

I can see some log entries:

Nov 26 02:27:50 tools-sgebastion-10 kernel: [9562349.633512] nfs: server tools-nfs.svc.tools.eqiad1.wikimedia.cloud not responding, timed out
Nov 26 02:30:50 tools-sgebastion-10 kernel: [9562529.852418] nfs: server tools-nfs.svc.tools.eqiad1.wikimedia.cloud not responding, timed out
Nov 26 02:32:20 tools-sgebastion-10 kernel: [9562619.961935] nfs: server tools-nfs.svc.tools.eqiad1.wikimedia.cloud not responding, timed out
Nov 26 02:33:51 tools-sgebastion-10 kernel: [9562710.071492] nfs: server tools-nfs.svc.tools.eqiad1.wikimedia.cloud not responding, still trying

But in tools-bastion-13 I could not find anything.

There is nothing special in the logs on tools-nfs-2 either.
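
(For reference, the kind of search that surfaces those kernel messages on a client; exact flags illustrative:)

$ journalctl -k --since "2024-11-26 01:30" | grep -i 'not responding'
$ grep -i 'nfs.*not responding' /var/log/syslog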

So my theory is maybe a ceph network hiccup?

A couple of minutes before the NFS server was reported as not responding, the neutron-openvswitch-agent running on the cloudvirt hosting the NFS server had a problem:

Nov 26 02:23:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:23:35.225 2423439 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:23:35Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:23:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:23:35.226 2423439 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:23:58 cloudvirt1050 ovs-vsctl[2502996]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
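
That "multiple rows in Manager match" error suggests duplicate manager entries in the local OVSDB. A sketch of how to inspect them (standard ovs-vsctl commands; whether any cleanup is needed here is a separate question):

$ sudo ovs-vsctl list Manager
$ sudo ovs-vsctl get-manager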

And a few minutes before that, there was some kind of connectivity issue with rabbitmq:

Nov 26 02:00:34 cloudvirt1050 nova-compute[2423043]: 2024-11-26 02:00:34.841 2423043 ERROR oslo.messaging._drivers.impl_rabbit [-] [439d99a7-3820-4d5c-99b9-e9ab4152509f] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: EOF occurred in violation of protocol (_ssl.c:2393). Trying again in 0 seconds.: ssl.SSLEOFError: EOF occurred in violati>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.913 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [b4d09f1f-184c-4d9a-8a60-344f47a50549] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 nova-compute[2423043]: 2024-11-26 02:00:34.921 2423043 ERROR oslo.messaging._drivers.impl_rabbit [-] [3b99f279-5320-4d7b-9bc7-4e8a01785a1e] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0 seconds.: am>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.924 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [01623f62-7676-43d8-a131-b60a94c58b0b] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.926 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [a1098e0a-1fed-42ee-b789-b97b0dd902be] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.927 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [c26a5ab2-ae0e-495c-ac01-e177f57b24bb] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.929 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [815db45a-8134-4e2a-9b11-4d090ab40b91] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.932 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [900b1a8f-4b49-4951-a886-53b1ac474b66] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.936 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [324ba1cc-36d0-490e-acfd-ebff6e93e748] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.938 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [61b0ffa4-0a94-4923-b078-3e2f932fbd64] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.943 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [f4401bc3-9945-4c21-b723-398a9fb9f85d] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 nova-compute[2423043]: 2024-11-26 02:00:34.948 2423043 ERROR oslo.messaging._drivers.impl_rabbit [-] [439d99a7-3820-4d5c-99b9-e9ab4152509f] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.982 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [33ad3491-8fe6-407a-bcdd-acac828c28f7] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: EOF occurred in violation of protocol (_ssl.c:2393). Trying again in 0 seconds.: ssl.SSLEOFError: EOF occurr>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.984 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [5b12142b-2119-430c-a9b5-9661969cf03f] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:34 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:34.986 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [cad952e8-247b-491e-a4ff-e3988ca340a9] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 0>
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.020 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [b4d09f1f-184c-4d9a-8a60-344f47a50549] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 nova-compute[2423043]: 2024-11-26 02:00:35.025 2423043 ERROR oslo.messaging._drivers.impl_rabbit [-] [3b99f279-5320-4d7b-9bc7-4e8a01785a1e] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.027 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [01623f62-7676-43d8-a131-b60a94c58b0b] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.030 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [a1098e0a-1fed-42ee-b789-b97b0dd902be] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.031 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [c26a5ab2-ae0e-495c-ac01-e177f57b24bb] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.034 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [815db45a-8134-4e2a-9b11-4d090ab40b91] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.036 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [900b1a8f-4b49-4951-a886-53b1ac474b66] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.039 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [324ba1cc-36d0-490e-acfd-ebff6e93e748] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.041 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [61b0ffa4-0a94-4923-b078-3e2f932fbd64] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.046 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [f4401bc3-9945-4c21-b723-398a9fb9f85d] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.085 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [33ad3491-8fe6-407a-bcdd-acac828c28f7] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.087 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [5b12142b-2119-430c-a9b5-9661969cf03f] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:35.089 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] [cad952e8-247b-491e-a4ff-e3988ca340a9] AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 0 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:43 cloudvirt1050 nova-compute[2423043]: 2024-11-26 02:00:43.547 2423043 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
Nov 26 02:00:45 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:00:45.522 2423439 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
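
The CONNECTION_FORCED / "broker forced connection closure with reason 'shutdown'" messages point at the broker side rather than the network. A sketch of the obvious checks on the rabbitmq host (standard commands; the unit name is assumed to be the Debian default):

$ sudo rabbitmqctl cluster_status
$ sudo journalctl -u rabbitmq-server --since "2024-11-26 01:55"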

> So my theory is maybe a ceph network hiccup?

There's no traffic interruption, error spike, or drop spike on the (cloudsw) switches, nor any flips on the ceph health/degraded objects dashboard. If it was ceph, it was not a general blip, but something related to a specific rados object.
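
(For reference, the dashboard view can be cross-checked from a ceph admin node with the standard CLI; sketch only:)

$ sudo ceph health detail
$ sudo ceph -s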

Supporting the theory of some kind of general OpenStack network problem, openvswitch failed on pretty much all the cloudvirts at more or less the same time:

aborrero@cloudcumin1001:~ $ sudo cumin cloudvirt1* 'journalctl --prio=err --since 2024-11-23 | grep ovs'
37 hosts will be targeted:
cloudvirt[1031-1067].eqiad.wmnet
OK to proceed on 37 hosts? Enter the number of affected hosts to confirm or "q" to quit: 37
===== NODE GROUP =====
(1) cloudvirt1064.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
9Nov 26 02:14:55 cloudvirt1064 neutron-openvswitch-agent[4085755]: 2024-11-26 02:14:55.381 4085755 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:14:55Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
10Nov 26 02:14:55 cloudvirt1064 neutron-openvswitch-agent[4085755]: 2024-11-26 02:14:55.382 4085755 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
11Nov 26 02:15:19 cloudvirt1064 ovs-vsctl[1485208]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
12Nov 26 02:45:45 cloudvirt1064 neutron-openvswitch-agent[1484025]: 2024-11-26 02:45:45.317 1484025 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:45:45Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
13Nov 26 02:45:45 cloudvirt1064 neutron-openvswitch-agent[1484025]: 2024-11-26 02:45:45.318 1484025 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
14Nov 26 02:46:01 cloudvirt1064 ovs-vsctl[1502839]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
15===== NODE GROUP =====
16(1) cloudvirt1043.eqiad.wmnet
17----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
18Nov 26 02:23:08 cloudvirt1043 neutron-openvswitch-agent[3183565]: 2024-11-26 02:23:08.015 3183565 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:23:08Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
19Nov 26 02:23:08 cloudvirt1043 neutron-openvswitch-agent[3183565]: 2024-11-26 02:23:08.015 3183565 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
20Nov 26 02:23:29 cloudvirt1043 ovs-vsctl[3731568]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
21===== NODE GROUP =====
22(1) cloudvirt1063.eqiad.wmnet
23----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
24Nov 23 03:27:13 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-23 03:27:13.267 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 8bcb7178-3df7-4b4d-a2fa-d11de176e618 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
25Nov 23 11:07:51 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-23 11:07:51.882 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: ff280876-3c06-4604-87ed-8b2a047821dc has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
26Nov 23 11:58:28 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-23 11:58:28.130 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 2b1a85ba-9dd5-4dff-8c6d-bd4b723148b3 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
27Nov 23 21:11:32 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-23 21:11:32.688 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: f30400df-8d57-4be7-a67f-08b453403298 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
28Nov 24 03:48:12 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-24 03:48:12.480 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: aa343884-9798-4ca2-a406-8fd350b89b5c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
29Nov 24 12:09:50 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-24 12:09:50.857 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 2a387865-5e5a-4887-8187-4cce372858fa has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
30Nov 24 21:03:36 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-24 21:03:36.684 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 240258c6-dedf-4a18-94a9-cf2e7bb5b81f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
31Nov 24 22:12:35 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-24 22:12:35.698 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 48e59ef5-a373-4423-99c0-b2d0fd5f56aa has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
32Nov 25 01:02:41 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 01:02:41.387 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 80815f79-f9bb-4c23-b2d8-6b2ca33e9a73 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
33Nov 25 01:53:17 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 01:53:17.732 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 30b493dc-9e9c-4002-8671-46f4f96b7715 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
34Nov 25 05:52:18 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 05:52:18.560 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 50b67405-4b36-4416-bdce-a2e388013825 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
35Nov 25 07:01:21 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 07:01:21.611 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: d2b012dc-15a2-4b01-83ad-21bf1ae91074 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
36Nov 25 07:14:04 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 07:14:04.187 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 096ffb75-56db-499d-97fd-7418a23601f1 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
37Nov 25 21:45:55 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-25 21:45:55.203 1855 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a4b44b69-4a60-4bb9-b514-28cbc145f311 - - - - - -] VIF port: 51d97f72-3690-48ea-a863-bba83cb95d01 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
38Nov 26 02:07:02 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-26 02:07:02.807 1855 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:07:02Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
39Nov 26 02:07:02 cloudvirt1063 neutron-openvswitch-agent[1855]: 2024-11-26 02:07:02.808 1855 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
40Nov 26 02:07:22 cloudvirt1063 ovs-vsctl[2940648]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
41Nov 26 02:37:37 cloudvirt1063 neutron-openvswitch-agent[2939483]: 2024-11-26 02:37:37.994 2939483 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:37:37Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
42Nov 26 02:37:37 cloudvirt1063 neutron-openvswitch-agent[2939483]: 2024-11-26 02:37:37.997 2939483 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
43Nov 26 02:37:53 cloudvirt1063 ovs-vsctl[2958790]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
44===== NODE GROUP =====
45(1) cloudvirt1042.eqiad.wmnet
46----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
47Nov 26 02:01:53 cloudvirt1042 neutron-openvswitch-agent[3444369]: 2024-11-26 02:01:53.493 3444369 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:01:53Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
48Nov 26 02:01:53 cloudvirt1042 neutron-openvswitch-agent[3444369]: 2024-11-26 02:01:53.494 3444369 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
49Nov 26 02:02:15 cloudvirt1042 ovs-vsctl[3987074]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
50Nov 26 02:02:16 cloudvirt1042 neutron-openvswitch-agent[3985786]: 2024-11-26 02:02:16.940 3985786 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7372c221-8ea0-4e16-a130-73956ebe511f - - - - - -] VIF port: be130cdc-18ac-4005-b416-72ae9934855c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
51Nov 26 02:02:33 cloudvirt1042 neutron-openvswitch-agent[3985786]: 2024-11-26 02:02:33.616 3985786 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7372c221-8ea0-4e16-a130-73956ebe511f - - - - - -] VIF port: be130cdc-18ac-4005-b416-72ae9934855c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
52Nov 26 02:31:55 cloudvirt1042 neutron-openvswitch-agent[3985786]: 2024-11-26 02:31:55.251 3985786 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:31:55Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
53Nov 26 02:31:55 cloudvirt1042 neutron-openvswitch-agent[3985786]: 2024-11-26 02:31:55.252 3985786 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
54Nov 26 02:32:05 cloudvirt1042 ovs-vsctl[4003995]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
55Nov 26 02:32:07 cloudvirt1042 neutron-openvswitch-agent[4002472]: 2024-11-26 02:32:07.661 4002472 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-53e7090f-c390-4a90-ae7b-7f82641b803f - - - - - -] VIF port: be130cdc-18ac-4005-b416-72ae9934855c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
56Nov 26 02:32:23 cloudvirt1042 neutron-openvswitch-agent[4002472]: 2024-11-26 02:32:23.915 4002472 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-53e7090f-c390-4a90-ae7b-7f82641b803f - - - - - -] VIF port: be130cdc-18ac-4005-b416-72ae9934855c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
57===== NODE GROUP =====
58(1) cloudvirt1062.eqiad.wmnet
59----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
60Nov 23 02:18:57 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 02:18:57.874 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 403328f8-2832-4d13-852a-9d7777035895 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
61Nov 23 05:04:49 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 05:04:49.437 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: b1bc83f8-2fad-434f-9ddd-ab087178ff98 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
62Nov 23 05:15:15 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 05:15:15.918 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 0c2485c2-9e39-4569-a125-c666358ed217 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
63Nov 23 05:36:18 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 05:36:18.915 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: c390f3e8-d2b8-4478-9fe7-2be93f4fb331 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
64Nov 23 05:57:27 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 05:57:27.881 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 746327d8-3726-42a0-a162-a17b2a7d1207 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
65Nov 23 12:00:18 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 12:00:18.504 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 37ed330a-65a6-42c7-b71f-78b2b7e9ec58 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
66Nov 23 15:33:40 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 15:33:40.344 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: c9ba2fef-fa7c-457d-bf0d-a57d514b5ca8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
67Nov 23 17:26:15 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 17:26:15.534 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 5ab9441a-be17-44eb-83a2-0ebf62938f20 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
68Nov 23 19:39:09 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 19:39:09.571 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: e6b7e940-471c-4ab3-8071-351bde1ad992 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
69Nov 23 20:00:18 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-23 20:00:18.497 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: ada97325-4b2f-44fa-b637-d92b462a0a65 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
70Nov 24 13:30:56 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-24 13:30:56.804 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 88f8f310-7f8f-42e8-8e34-9c8553f5503f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
71Nov 24 20:03:12 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-24 20:03:12.890 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 9223c437-829b-45d1-bb0f-7cd5c8ea8e38 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
72Nov 24 22:37:50 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-24 22:37:50.115 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 49f9d406-5129-4d6d-b294-fe66391d81b5 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
73Nov 24 22:50:22 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-24 22:50:22.679 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 46d544d8-2c27-47d8-a78e-b2610087863a has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
74Nov 25 00:06:10 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 00:06:10.184 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 5600c1c4-678a-4c70-8030-fc99d61d434d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
75Nov 25 02:05:47 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 02:05:47.796 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 7ab6c872-1cad-4481-adea-55602d6a3669 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
76Nov 25 06:36:16 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 06:36:16.391 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: b062ade8-71b9-4fc8-873e-0580791def98 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
77Nov 25 09:57:35 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 09:57:35.981 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 99e40e05-90c5-4c2f-8f8b-5d3abf7bf8b9 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
78Nov 25 10:35:33 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 10:35:33.711 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 5c71a9b4-8995-4511-9526-27f54cef6339 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
79Nov 25 12:00:11 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 12:00:11.583 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: d152d69e-73f8-40ff-a21e-0a4be09519bc has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
80Nov 25 18:00:22 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 18:00:22.180 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 28dfb6bd-7e29-4987-bf46-6c4a2f9f9792 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
81Nov 25 18:42:04 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-25 18:42:04.079 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: aa809336-1386-4fa2-8eec-ffc063ca1fcc has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
82Nov 26 00:12:31 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-26 00:12:31.049 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 0411636c-ce17-45dc-9609-0f2321566d6f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
83Nov 26 01:43:35 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-26 01:43:35.098 1861 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6058d939-c9c8-4dbc-acea-8c1402c95437 - - - - - -] VIF port: 22720895-805b-42e5-94f0-cce819e23160 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
84Nov 26 02:21:27 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-26 02:21:27.234 1861 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:21:27Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
85Nov 26 02:21:27 cloudvirt1062 neutron-openvswitch-agent[1861]: 2024-11-26 02:21:27.234 1861 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
86Nov 26 02:21:48 cloudvirt1062 ovs-vsctl[1677467]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
87Nov 26 06:07:05 cloudvirt1062 neutron-openvswitch-agent[1676320]: 2024-11-26 06:07:05.909 1676320 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-944b0efe-8931-4df5-a6a4-a9221c659a0a - - - - - -] VIF port: f9c47d28-7978-456d-afc4-d0be53398de4 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
88Nov 26 10:11:28 cloudvirt1062 neutron-openvswitch-agent[1676320]: 2024-11-26 10:11:28.799 1676320 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-944b0efe-8931-4df5-a6a4-a9221c659a0a - - - - - -] VIF port: 466b6401-9325-45ee-ab1d-8e41225205aa has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
89===== NODE GROUP =====
90(1) cloudvirt1061.eqiad.wmnet
91----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
92Nov 26 02:18:14 cloudvirt1061 neutron-openvswitch-agent[3306465]: 2024-11-26 02:18:14.710 3306465 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:18:14Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
93Nov 26 02:18:14 cloudvirt1061 neutron-openvswitch-agent[3306465]: 2024-11-26 02:18:14.711 3306465 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
94Nov 26 02:18:37 cloudvirt1061 ovs-vsctl[3362300]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
95===== NODE GROUP =====
96(1) cloudvirt1041.eqiad.wmnet
97----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
98Nov 26 02:18:45 cloudvirt1041 neutron-openvswitch-agent[368018]: 2024-11-26 02:18:45.768 368018 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:18:45Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
99Nov 26 02:18:45 cloudvirt1041 neutron-openvswitch-agent[368018]: 2024-11-26 02:18:45.769 368018 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
100Nov 26 02:19:06 cloudvirt1041 ovs-vsctl[550760]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
101===== NODE GROUP =====
102(1) cloudvirt1040.eqiad.wmnet
103----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
104Nov 26 02:08:24 cloudvirt1040 neutron-openvswitch-agent[337445]: 2024-11-26 02:08:24.314 337445 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:08:24Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
105Nov 26 02:08:24 cloudvirt1040 neutron-openvswitch-agent[337445]: 2024-11-26 02:08:24.316 337445 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
106Nov 26 02:08:45 cloudvirt1040 ovs-vsctl[876025]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
107Nov 26 02:38:09 cloudvirt1040 neutron-openvswitch-agent[874864]: 2024-11-26 02:38:09.767 874864 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:38:09Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
108Nov 26 02:38:09 cloudvirt1040 neutron-openvswitch-agent[874864]: 2024-11-26 02:38:09.767 874864 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
109Nov 26 02:38:26 cloudvirt1040 ovs-vsctl[892783]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
110===== NODE GROUP =====
111(1) cloudvirt1060.eqiad.wmnet
112----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
113Nov 26 02:24:09 cloudvirt1060 neutron-openvswitch-agent[1390856]: 2024-11-26 02:24:09.774 1390856 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:24:09Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
114Nov 26 02:24:09 cloudvirt1060 neutron-openvswitch-agent[1390856]: 2024-11-26 02:24:09.774 1390856 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
115Nov 26 02:24:32 cloudvirt1060 ovs-vsctl[1426820]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
116Nov 26 02:24:33 cloudvirt1060 neutron-openvswitch-agent[1425644]: 2024-11-26 02:24:33.730 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-63068236-b65a-4d08-a061-354b153cc156 - - - - - -] VIF port: a5b3ed19-0d9f-44f3-8d32-039cbfd953a6 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
117Nov 26 02:24:46 cloudvirt1060 neutron-openvswitch-agent[1425644]: 2024-11-26 02:24:46.141 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-63068236-b65a-4d08-a061-354b153cc156 - - - - - -] VIF port: a5b3ed19-0d9f-44f3-8d32-039cbfd953a6 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
118Nov 26 11:34:01 cloudvirt1060 neutron-openvswitch-agent[1425644]: 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [-] Failed reporting state!: oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq02.eqiad1.wikimediacloud.org:5671 after inf tries:
119 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
120 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1005, in ensure
121 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ret, channel = autoretry_method()
122 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^
123 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 523, in _ensured
124 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, **kwargs)
125 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^
126 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 599, in __call__
127 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, channel=channels[0], **kwargs), channels[0]
128 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
129 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 994, in execute_method
130 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent method()
131 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1432, in _publish
132 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._producer.publish(
133 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 177, in publish
134 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return _publish(
135 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^
136 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 199, in _publish
137 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return channel.basic_publish(
138 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^
139 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1820, in basic_publish_confirm
140 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.wait([spec.Basic.Ack, spec.Basic.Nack],
141 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 99, in wait
142 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.connection.drain_events(timeout=timeout)
143 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 525, in drain_events
144 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent while not self.blocking_read(timeout):
145 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^
146 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 531, in blocking_read
147 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.on_inbound_frame(frame)
148 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
149 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/method_framing.py", line 53, in on_frame
150 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent callback(channel, method_sig, buf, None)
151 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 537, in on_inbound_method
152 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.channels[channel_id].dispatch_method(
153 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
154 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 159, in dispatch_method
155 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent one_shot(method_sig, *args)
156 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 160, in __call__
157 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.throw()
158 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^
159 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 157, in __call__
160 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent retval = fun(*final_args, **final_kwargs)
161 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
162 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1812, in confirm_handler
163 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise MessageNacked()
164 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent amqp.exceptions.MessageNacked
165 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
166 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent During handling of the above exception, another exception occurred:
167 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
168 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
169 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 437, in _report_state
170 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent agent_status = self.state_rpc.report_state(self.context,
171 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
172 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/agent/rpc.py", line 105, in report_state
173 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return method(context, 'report_state', **kwargs)
174 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
175 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
176 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent result = self.transport._send(
177 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^
178 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/transport.py", line 123, in _send
179 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._driver.send(target, ctxt, message,
180 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
181 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
182 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._send(target, ctxt, message, wait_for_reply, timeout,
183 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
184 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 673, in _send
185 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent conn.topic_send(exchange_name=exchange, topic=topic,
186 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1531, in topic_send
187 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._ensure_publishing(self._publish, exchange, msg,
188 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1383, in _ensure_publishing
189 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.ensure(method, retry=retry, error_callback=_error_callback)
190 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1027, in ensure
191 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise exceptions.MessageDeliveryFailure(msg)
192 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq02.eqiad1.wikimediacloud.org:5671 after inf tries:
193 2024-11-26 11:34:01.333 1425644 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
194===== NODE GROUP =====
195(1) cloudvirt1039.eqiad.wmnet
196----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
197Nov 26 02:01:52 cloudvirt1039 neutron-openvswitch-agent[4163565]: 2024-11-26 02:01:52.135 4163565 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:01:52Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
198Nov 26 02:01:52 cloudvirt1039 neutron-openvswitch-agent[4163565]: 2024-11-26 02:01:52.136 4163565 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
199Nov 26 02:02:13 cloudvirt1039 ovs-vsctl[69759]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
200Nov 26 02:02:14 cloudvirt1039 neutron-openvswitch-agent[68547]: 2024-11-26 02:02:14.499 68547 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-910c91e9-cf9e-4bcd-b822-cfa5944ff815 - - - - - -] VIF port: e7d47073-e78f-4fd4-a14a-4f111868635b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
201Nov 26 02:02:14 cloudvirt1039 neutron-openvswitch-agent[68547]: 2024-11-26 02:02:14.517 68547 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-910c91e9-cf9e-4bcd-b822-cfa5944ff815 - - - - - -] VIF port: 194dc187-4b35-4069-8ba8-0a0b1eb5cc58 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
202Nov 26 02:02:23 cloudvirt1039 neutron-openvswitch-agent[68547]: 2024-11-26 02:02:23.190 68547 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-910c91e9-cf9e-4bcd-b822-cfa5944ff815 - - - - - -] VIF port: 194dc187-4b35-4069-8ba8-0a0b1eb5cc58 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
203Nov 26 02:30:50 cloudvirt1039 neutron-openvswitch-agent[68547]: 2024-11-26 02:30:50.408 68547 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:30:50Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
204Nov 26 02:30:50 cloudvirt1039 neutron-openvswitch-agent[68547]: 2024-11-26 02:30:50.408 68547 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
205Nov 26 02:30:59 cloudvirt1039 ovs-vsctl[85778]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
206Nov 26 02:31:00 cloudvirt1039 neutron-openvswitch-agent[84595]: 2024-11-26 02:31:00.315 84595 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7b381fcd-a15c-4c64-b66b-498326407908 - - - - - -] VIF port: 194dc187-4b35-4069-8ba8-0a0b1eb5cc58 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
207Nov 26 02:31:00 cloudvirt1039 neutron-openvswitch-agent[84595]: 2024-11-26 02:31:00.338 84595 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7b381fcd-a15c-4c64-b66b-498326407908 - - - - - -] VIF port: e7d47073-e78f-4fd4-a14a-4f111868635b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
208Nov 26 02:31:09 cloudvirt1039 neutron-openvswitch-agent[84595]: 2024-11-26 02:31:09.232 84595 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7b381fcd-a15c-4c64-b66b-498326407908 - - - - - -] VIF port: 194dc187-4b35-4069-8ba8-0a0b1eb5cc58 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1059.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
212Nov 26 02:26:02 cloudvirt1059 neutron-openvswitch-agent[1998409]: 2024-11-26 02:26:02.594 1998409 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:26:02Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
213Nov 26 02:26:02 cloudvirt1059 neutron-openvswitch-agent[1998409]: 2024-11-26 02:26:02.595 1998409 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
214Nov 26 02:26:25 cloudvirt1059 ovs-vsctl[2082178]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1038.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
218Nov 26 02:25:06 cloudvirt1038 neutron-openvswitch-agent[237034]: 2024-11-26 02:25:06.384 237034 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:25:06Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
219Nov 26 02:25:06 cloudvirt1038 neutron-openvswitch-agent[237034]: 2024-11-26 02:25:06.385 237034 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
220Nov 26 02:25:27 cloudvirt1038 ovs-vsctl[352874]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
221Nov 26 02:25:30 cloudvirt1038 neutron-openvswitch-agent[351687]: 2024-11-26 02:25:30.699 351687 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-964b213f-4f61-4a21-ad18-a76042babd81 - - - - - -] VIF port: 7e769c9a-d547-440d-ac83-2c954603985d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1037.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
225Nov 26 02:21:13 cloudvirt1037 neutron-openvswitch-agent[75768]: 2024-11-26 02:21:13.823 75768 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:21:13Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
226Nov 26 02:21:13 cloudvirt1037 neutron-openvswitch-agent[75768]: 2024-11-26 02:21:13.824 75768 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
227Nov 26 02:21:35 cloudvirt1037 ovs-vsctl[193120]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1057.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
231Nov 26 02:20:53 cloudvirt1057 neutron-openvswitch-agent[2057687]: 2024-11-26 02:20:53.851 2057687 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:20:53Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
232Nov 26 02:20:53 cloudvirt1057 neutron-openvswitch-agent[2057687]: 2024-11-26 02:20:53.851 2057687 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
233Nov 26 02:21:16 cloudvirt1057 ovs-vsctl[2253639]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1058.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
237Nov 26 02:06:17 cloudvirt1058 neutron-openvswitch-agent[2277970]: 2024-11-26 02:06:17.660 2277970 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:06:17Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
238Nov 26 02:06:17 cloudvirt1058 neutron-openvswitch-agent[2277970]: 2024-11-26 02:06:17.660 2277970 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
239Nov 26 02:06:40 cloudvirt1058 ovs-vsctl[2570820]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
240Nov 26 02:06:42 cloudvirt1058 neutron-openvswitch-agent[2569639]: 2024-11-26 02:06:42.781 2569639 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-ff75fa59-044e-4406-8165-975ab29d9fb8 - - - - - -] VIF port: 3c3e2bbf-9a8b-421a-8a94-4942de1bcf5d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
241Nov 26 02:07:10 cloudvirt1058 neutron-openvswitch-agent[2569639]: 2024-11-26 02:07:10.680 2569639 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-ff75fa59-044e-4406-8165-975ab29d9fb8 - - - - - -] VIF port: 3c3e2bbf-9a8b-421a-8a94-4942de1bcf5d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
242Nov 26 02:07:24 cloudvirt1058 neutron-openvswitch-agent[2569639]: 2024-11-26 02:07:24.234 2569639 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-ff75fa59-044e-4406-8165-975ab29d9fb8 - - - - - -] VIF port: 3c3e2bbf-9a8b-421a-8a94-4942de1bcf5d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
243Nov 26 02:36:15 cloudvirt1058 neutron-openvswitch-agent[2569639]: 2024-11-26 02:36:15.629 2569639 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:36:15Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
244Nov 26 02:36:15 cloudvirt1058 neutron-openvswitch-agent[2569639]: 2024-11-26 02:36:15.631 2569639 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
245Nov 26 02:36:26 cloudvirt1058 ovs-vsctl[2587895]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
246Nov 26 02:36:29 cloudvirt1058 neutron-openvswitch-agent[2586707]: 2024-11-26 02:36:29.102 2586707 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-9f00b4a1-093c-46a4-98d9-ac9274aeb741 - - - - - -] VIF port: 3c3e2bbf-9a8b-421a-8a94-4942de1bcf5d has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1036.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
250Nov 26 02:11:53 cloudvirt1036 neutron-openvswitch-agent[226063]: 2024-11-26 02:11:53.126 226063 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:11:53Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
251Nov 26 02:11:53 cloudvirt1036 neutron-openvswitch-agent[226063]: 2024-11-26 02:11:53.126 226063 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
252Nov 26 02:12:14 cloudvirt1036 ovs-vsctl[303470]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
253Nov 26 02:12:16 cloudvirt1036 neutron-openvswitch-agent[302275]: 2024-11-26 02:12:16.513 302275 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-75a43798-4f6b-4e0c-8963-ac0bdaf35a4f - - - - - -] VIF port: 1fa25616-17c4-4ec2-857e-3c8d314f4778 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
254Nov 26 02:12:32 cloudvirt1036 neutron-openvswitch-agent[302275]: 2024-11-26 02:12:32.346 302275 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-75a43798-4f6b-4e0c-8963-ac0bdaf35a4f - - - - - -] VIF port: 1fa25616-17c4-4ec2-857e-3c8d314f4778 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
255Nov 26 02:42:46 cloudvirt1036 neutron-openvswitch-agent[302275]: 2024-11-26 02:42:46.201 302275 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:42:46Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
256Nov 26 02:42:46 cloudvirt1036 neutron-openvswitch-agent[302275]: 2024-11-26 02:42:46.202 302275 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
257Nov 26 02:43:00 cloudvirt1036 ovs-vsctl[320785]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
258Nov 26 02:43:02 cloudvirt1036 neutron-openvswitch-agent[319240]: 2024-11-26 02:43:02.547 319240 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7508fb86-5840-46c9-affc-7b9e5c1c5ac2 - - - - - -] VIF port: 1fa25616-17c4-4ec2-857e-3c8d314f4778 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
259Nov 26 02:43:18 cloudvirt1036 neutron-openvswitch-agent[319240]: 2024-11-26 02:43:18.621 319240 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-7508fb86-5840-46c9-affc-7b9e5c1c5ac2 - - - - - -] VIF port: 1fa25616-17c4-4ec2-857e-3c8d314f4778 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1056.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
263Nov 26 02:30:03 cloudvirt1056 neutron-openvswitch-agent[3483232]: 2024-11-26 02:30:03.892 3483232 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:30:03Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
264Nov 26 02:30:03 cloudvirt1056 neutron-openvswitch-agent[3483232]: 2024-11-26 02:30:03.892 3483232 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
265Nov 26 02:30:27 cloudvirt1056 ovs-vsctl[3596693]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1055.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
269Nov 26 02:06:46 cloudvirt1055 neutron-openvswitch-agent[3572196]: 2024-11-26 02:06:46.537 3572196 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:06:46Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
270Nov 26 02:06:46 cloudvirt1055 neutron-openvswitch-agent[3572196]: 2024-11-26 02:06:46.537 3572196 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
271Nov 26 02:07:09 cloudvirt1055 ovs-vsctl[3682262]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
272Nov 26 02:35:52 cloudvirt1055 neutron-openvswitch-agent[3681068]: 2024-11-26 02:35:52.226 3681068 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:35:52Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
273Nov 26 02:35:52 cloudvirt1055 neutron-openvswitch-agent[3681068]: 2024-11-26 02:35:52.229 3681068 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
274Nov 26 02:36:06 cloudvirt1055 ovs-vsctl[3698492]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1035.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
278Nov 23 09:04:58 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-23 09:04:58.687 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 2009eed5-398d-4422-b6b5-b2b2c838665b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
279Nov 23 12:00:24 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-23 12:00:24.454 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 3f85e787-1760-444a-a5df-28e6b5bf21b5 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
280Nov 23 12:00:32 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-23 12:00:32.448 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: b9be1bcf-d53a-4f1d-84f3-49d932acbe0e has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
281Nov 23 12:00:34 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-23 12:00:34.450 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: b9be1bcf-d53a-4f1d-84f3-49d932acbe0e has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
282Nov 23 18:38:02 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-23 18:38:02.207 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: d86f6618-181f-4f9c-9355-6c2d7b8bc1bb has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
283Nov 24 00:54:26 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 00:54:26.708 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 2dc22731-17cc-4152-a9fb-7573090a3141 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
284Nov 24 06:23:45 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 06:23:45.551 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 22ddca87-42c0-4e77-adc9-efb293350ca6 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
285Nov 24 07:36:46 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 07:36:46.745 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: d7c44d06-cb0f-4a32-807e-1013d7271206 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
286Nov 24 12:00:28 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 12:00:28.446 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: ca5ef2d9-e376-40e9-bfc0-97e418a230c1 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
287Nov 24 12:00:40 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 12:00:40.457 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 70225b72-6137-403e-92a7-67a0a964a3d3 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
288Nov 24 16:16:27 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 16:16:27.734 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 3e5bd1f6-6ca3-472e-8387-e49e860e801c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
289Nov 24 16:37:38 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 16:37:38.666 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 93757ebc-f207-46b4-b0ae-f312dc223e1b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
290Nov 24 16:48:13 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 16:48:13.120 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: b026eb0b-b3fa-4914-8436-6ee5abc36340 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
291Nov 24 17:30:39 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-24 17:30:39.016 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: c894b7a3-e1cc-48b5-a70e-2960743acbec has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
292Nov 25 05:58:41 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-25 05:58:41.902 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: 70914dfa-6ab1-4a44-aeba-bd4d463356e6 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
293Nov 25 14:43:28 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-25 14:43:28.926 281153 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a952275c-5a49-407b-bd8a-99664480b49a - - - - - -] VIF port: ad7c7d75-7c5d-4707-bda9-86db2802a988 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
294Nov 26 02:01:24 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-26 02:01:24.695 281153 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:01:24Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
295Nov 26 02:01:24 cloudvirt1035 neutron-openvswitch-agent[281153]: 2024-11-26 02:01:24.696 281153 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
296Nov 26 02:01:45 cloudvirt1035 ovs-vsctl[330261]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
297Nov 26 02:31:41 cloudvirt1035 neutron-openvswitch-agent[329085]: 2024-11-26 02:31:41.750 329085 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:31:41Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
298Nov 26 02:31:41 cloudvirt1035 neutron-openvswitch-agent[329085]: 2024-11-26 02:31:41.751 329085 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
299Nov 26 02:31:56 cloudvirt1035 ovs-vsctl[347968]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
300Nov 26 03:24:04 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 03:24:04.263 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: 013af123-f41f-4c14-800b-b5a363e471d8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
301Nov 26 05:25:59 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 05:25:59.711 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: 333f9431-cb8d-471f-b52f-a4ccffc149b0 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
302Nov 26 07:08:42 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 07:08:42.302 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: b28c57d6-324e-41c5-9f31-511a4475163a has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
303Nov 26 08:09:55 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 08:09:55.163 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: e8033ecc-a633-4d6e-87a0-6e1e9b312bd5 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
304Nov 26 08:51:04 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 08:51:04.988 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: 93ba9652-f000-4edd-8f38-ef87796b6cd1 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
305Nov 26 10:21:27 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 10:21:27.011 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: 62b99a48-a321-40e4-b36d-a7497a3c231a has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
306Nov 26 12:22:48 cloudvirt1035 neutron-openvswitch-agent[346715]: 2024-11-26 12:22:48.407 346715 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-13c84754-d718-4377-bd76-12978f7571b1 - - - - - -] VIF port: e48b13a1-2b22-43d9-ba6c-5cd6f6a624fc has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1034.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
310Nov 26 02:23:47 cloudvirt1034 neutron-openvswitch-agent[2330594]: 2024-11-26 02:23:47.099 2330594 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:23:47Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
311Nov 26 02:23:47 cloudvirt1034 neutron-openvswitch-agent[2330594]: 2024-11-26 02:23:47.100 2330594 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
312Nov 26 02:24:08 cloudvirt1034 ovs-vsctl[2301196]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1054.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
316Nov 26 02:06:15 cloudvirt1054 neutron-openvswitch-agent[3439857]: 2024-11-26 02:06:15.182 3439857 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:06:15Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
317Nov 26 02:06:15 cloudvirt1054 neutron-openvswitch-agent[3439857]: 2024-11-26 02:06:15.182 3439857 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
318Nov 26 02:06:37 cloudvirt1054 ovs-vsctl[3555191]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
319Nov 26 02:36:58 cloudvirt1054 neutron-openvswitch-agent[3554023]: 2024-11-26 02:36:58.025 3554023 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:36:58Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
320Nov 26 02:36:58 cloudvirt1054 neutron-openvswitch-agent[3554023]: 2024-11-26 02:36:58.028 3554023 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
321Nov 26 02:37:09 cloudvirt1054 ovs-vsctl[3572598]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1053.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
325Nov 26 02:22:02 cloudvirt1053 neutron-openvswitch-agent[1627352]: 2024-11-26 02:22:02.140 1627352 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:22:02Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
326Nov 26 02:22:02 cloudvirt1053 neutron-openvswitch-agent[1627352]: 2024-11-26 02:22:02.140 1627352 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
327Nov 26 02:22:25 cloudvirt1053 ovs-vsctl[1690761]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1033.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
331Nov 26 02:08:43 cloudvirt1033 neutron-openvswitch-agent[3183103]: 2024-11-26 02:08:43.687 3183103 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:08:43Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
332Nov 26 02:08:43 cloudvirt1033 neutron-openvswitch-agent[3183103]: 2024-11-26 02:08:43.687 3183103 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
333Nov 26 02:09:04 cloudvirt1033 ovs-vsctl[3086274]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
334Nov 26 02:38:33 cloudvirt1033 neutron-openvswitch-agent[3084794]: 2024-11-26 02:38:33.788 3084794 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:38:33Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
335Nov 26 02:38:33 cloudvirt1033 neutron-openvswitch-agent[3084794]: 2024-11-26 02:38:33.789 3084794 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
336Nov 26 02:38:50 cloudvirt1033 ovs-vsctl[3102607]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
337===== NODE GROUP =====
338(1) cloudvirt1032.eqiad.wmnet
339----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
340Nov 26 02:07:47 cloudvirt1032 neutron-openvswitch-agent[2993514]: 2024-11-26 02:07:47.063 2993514 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:07:47Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
341Nov 26 02:07:47 cloudvirt1032 neutron-openvswitch-agent[2993514]: 2024-11-26 02:07:47.064 2993514 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
342Nov 26 02:08:08 cloudvirt1032 ovs-vsctl[2961784]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
343Nov 26 02:37:16 cloudvirt1032 neutron-openvswitch-agent[2960590]: 2024-11-26 02:37:16.475 2960590 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:37:16Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
344Nov 26 02:37:16 cloudvirt1032 neutron-openvswitch-agent[2960590]: 2024-11-26 02:37:16.478 2960590 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
345Nov 26 02:37:29 cloudvirt1032 ovs-vsctl[2978086]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1052.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
349Nov 26 02:13:45 cloudvirt1052 neutron-openvswitch-agent[2242097]: 2024-11-26 02:13:45.124 2242097 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:13:45Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
350Nov 26 02:13:45 cloudvirt1052 neutron-openvswitch-agent[2242097]: 2024-11-26 02:13:45.124 2242097 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
351Nov 26 02:14:09 cloudvirt1052 ovs-vsctl[2297647]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
352Nov 26 02:14:12 cloudvirt1052 neutron-openvswitch-agent[2296236]: 2024-11-26 02:14:12.498 2296236 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-f756fb12-1297-4cfc-8f3c-40a267eb3a3a - - - - - -] VIF port: bbcde4f0-35f7-46b4-815a-f5121c184e4b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
353Nov 26 02:14:12 cloudvirt1052 neutron-openvswitch-agent[2296236]: 2024-11-26 02:14:12.570 2296236 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-f756fb12-1297-4cfc-8f3c-40a267eb3a3a - - - - - -] VIF port: dccee2b4-3d58-4b16-a3a5-b0bb67292ce7 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
354Nov 26 02:14:28 cloudvirt1052 neutron-openvswitch-agent[2296236]: 2024-11-26 02:14:28.026 2296236 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-f756fb12-1297-4cfc-8f3c-40a267eb3a3a - - - - - -] VIF port: bbcde4f0-35f7-46b4-815a-f5121c184e4b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
355Nov 26 02:42:52 cloudvirt1052 neutron-openvswitch-agent[2296236]: 2024-11-26 02:42:52.147 2296236 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:42:52Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
356Nov 26 02:42:52 cloudvirt1052 neutron-openvswitch-agent[2296236]: 2024-11-26 02:42:52.149 2296236 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
357Nov 26 02:43:06 cloudvirt1052 ovs-vsctl[2313883]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
358Nov 26 02:43:09 cloudvirt1052 neutron-openvswitch-agent[2312391]: 2024-11-26 02:43:09.010 2312391 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-e02d1e9a-6d5c-4a00-a2fe-36f61aa068e0 - - - - - -] VIF port: bbcde4f0-35f7-46b4-815a-f5121c184e4b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
359Nov 26 02:43:09 cloudvirt1052 neutron-openvswitch-agent[2312391]: 2024-11-26 02:43:09.045 2312391 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-e02d1e9a-6d5c-4a00-a2fe-36f61aa068e0 - - - - - -] VIF port: dccee2b4-3d58-4b16-a3a5-b0bb67292ce7 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
360Nov 26 02:43:23 cloudvirt1052 neutron-openvswitch-agent[2312391]: 2024-11-26 02:43:23.284 2312391 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-e02d1e9a-6d5c-4a00-a2fe-36f61aa068e0 - - - - - -] VIF port: bbcde4f0-35f7-46b4-815a-f5121c184e4b has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1031.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
364Nov 26 02:05:43 cloudvirt1031 neutron-openvswitch-agent[3981938]: 2024-11-26 02:05:43.952 3981938 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:05:43Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
365Nov 26 02:05:43 cloudvirt1031 neutron-openvswitch-agent[3981938]: 2024-11-26 02:05:43.952 3981938 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
366Nov 26 02:06:05 cloudvirt1031 ovs-vsctl[3951287]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
367Nov 26 02:35:13 cloudvirt1031 neutron-openvswitch-agent[3949775]: 2024-11-26 02:35:13.193 3949775 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:35:13Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
368Nov 26 02:35:13 cloudvirt1031 neutron-openvswitch-agent[3949775]: 2024-11-26 02:35:13.194 3949775 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
369Nov 26 02:35:30 cloudvirt1031 ovs-vsctl[3967611]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
370Nov 26 11:34:00 cloudvirt1031 neutron-openvswitch-agent[3966403]: 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [-] Failed reporting state!: oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq02.eqiad1.wikimediacloud.org:5671 after inf tries:
371 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
372 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1005, in ensure
373 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ret, channel = autoretry_method()
374 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^
375 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 523, in _ensured
376 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, **kwargs)
377 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^
378 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 599, in __call__
379 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, channel=channels[0], **kwargs), channels[0]
380 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
381 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 994, in execute_method
382 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent method()
383 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1432, in _publish
384 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._producer.publish(
385 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 177, in publish
386 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return _publish(
387 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^
388 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 199, in _publish
389 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return channel.basic_publish(
390 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^
391 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1820, in basic_publish_confirm
392 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.wait([spec.Basic.Ack, spec.Basic.Nack],
393 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 99, in wait
394 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.connection.drain_events(timeout=timeout)
395 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 525, in drain_events
396 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent while not self.blocking_read(timeout):
397 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^
398 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 531, in blocking_read
399 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.on_inbound_frame(frame)
400 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
401 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/method_framing.py", line 53, in on_frame
402 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent callback(channel, method_sig, buf, None)
403 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 537, in on_inbound_method
404 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.channels[channel_id].dispatch_method(
405 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
406 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 159, in dispatch_method
407 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent one_shot(method_sig, *args)
408 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 160, in __call__
409 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.throw()
410 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^
411 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 157, in __call__
412 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent retval = fun(*final_args, **final_kwargs)
413 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
414 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1812, in confirm_handler
415 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise MessageNacked()
416 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent amqp.exceptions.MessageNacked
417 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
418 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent During handling of the above exception, another exception occurred:
419 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
420 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
421 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 437, in _report_state
422 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent agent_status = self.state_rpc.report_state(self.context,
423 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
424 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/agent/rpc.py", line 105, in report_state
425 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return method(context, 'report_state', **kwargs)
426 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
427 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
428 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent result = self.transport._send(
429 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^
430 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/transport.py", line 123, in _send
431 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._driver.send(target, ctxt, message,
432 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
433 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
434 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._send(target, ctxt, message, wait_for_reply, timeout,
435 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
436 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 673, in _send
437 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent conn.topic_send(exchange_name=exchange, topic=topic,
438 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1531, in topic_send
439 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._ensure_publishing(self._publish, exchange, msg,
440 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1383, in _ensure_publishing
441 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.ensure(method, retry=retry, error_callback=_error_callback)
442 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1027, in ensure
443 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise exceptions.MessageDeliveryFailure(msg)
444 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq02.eqiad1.wikimediacloud.org:5671 after inf tries:
445 2024-11-26 11:34:00.984 3966403 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
===== NODE GROUP =====
(1) cloudvirt1051.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
449Nov 26 02:01:22 cloudvirt1051 neutron-openvswitch-agent[2450422]: 2024-11-26 02:01:22.691 2450422 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:01:22Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
450Nov 26 02:01:22 cloudvirt1051 neutron-openvswitch-agent[2450422]: 2024-11-26 02:01:22.692 2450422 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
451Nov 26 02:01:45 cloudvirt1051 ovs-vsctl[3469236]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
452Nov 26 02:01:47 cloudvirt1051 neutron-openvswitch-agent[3468039]: 2024-11-26 02:01:47.775 3468039 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-867056ea-4352-4c79-872e-b1617f67bb51 - - - - - -] VIF port: 67c689bd-44b5-4777-be15-456b8bb7e2e8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
453Nov 26 02:01:47 cloudvirt1051 neutron-openvswitch-agent[3468039]: 2024-11-26 02:01:47.787 3468039 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-867056ea-4352-4c79-872e-b1617f67bb51 - - - - - -] VIF port: b225eea5-63ec-4d28-ae23-8632566f6eed has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
454Nov 26 02:30:47 cloudvirt1051 neutron-openvswitch-agent[3468039]: 2024-11-26 02:30:47.275 3468039 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:30:47Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
455Nov 26 02:30:47 cloudvirt1051 neutron-openvswitch-agent[3468039]: 2024-11-26 02:30:47.276 3468039 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
456Nov 26 02:31:01 cloudvirt1051 ovs-vsctl[3486476]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
457Nov 26 02:31:04 cloudvirt1051 neutron-openvswitch-agent[3484952]: 2024-11-26 02:31:04.166 3484952 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-b7dff795-a061-48d7-934e-d04bcd09ecc7 - - - - - -] VIF port: b225eea5-63ec-4d28-ae23-8632566f6eed has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
458Nov 26 02:31:04 cloudvirt1051 neutron-openvswitch-agent[3484952]: 2024-11-26 02:31:04.235 3484952 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-b7dff795-a061-48d7-934e-d04bcd09ecc7 - - - - - -] VIF port: 67c689bd-44b5-4777-be15-456b8bb7e2e8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1049.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
462Nov 26 02:23:15 cloudvirt1049 neutron-openvswitch-agent[2419670]: 2024-11-26 02:23:15.316 2419670 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:23:15Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
463Nov 26 02:23:15 cloudvirt1049 neutron-openvswitch-agent[2419670]: 2024-11-26 02:23:15.317 2419670 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
464Nov 26 02:23:39 cloudvirt1049 ovs-vsctl[2544177]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
465Nov 26 02:23:41 cloudvirt1049 neutron-openvswitch-agent[2542995]: 2024-11-26 02:23:41.570 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6491ffde-a94b-4f5d-9995-f7be484aec52 - - - - - -] VIF port: c00b99b9-d073-4eae-aa93-47eb21f44852 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
466Nov 26 02:23:41 cloudvirt1049 neutron-openvswitch-agent[2542995]: 2024-11-26 02:23:41.595 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6491ffde-a94b-4f5d-9995-f7be484aec52 - - - - - -] VIF port: dc466571-ce4c-4058-82bd-66a8871d393c has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
467Nov 26 11:34:08 cloudvirt1049 neutron-openvswitch-agent[2542995]: 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [-] Failed reporting state!: oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 after inf tries:
468 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
469 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1005, in ensure
470 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ret, channel = autoretry_method()
471 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^
472 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 523, in _ensured
473 2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, **kwargs)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/connection.py", line 599, in __call__
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return fun(*args, channel=channels[0], **kwargs), channels[0]
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 994, in execute_method
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent method()
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1432, in _publish
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._producer.publish(
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 177, in publish
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return _publish(
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/kombu/messaging.py", line 199, in _publish
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return channel.basic_publish(
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1820, in basic_publish_confirm
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.wait([spec.Basic.Ack, spec.Basic.Nack],
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 99, in wait
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.connection.drain_events(timeout=timeout)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 525, in drain_events
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent while not self.blocking_read(timeout):
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 531, in blocking_read
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.on_inbound_frame(frame)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/method_framing.py", line 53, in on_frame
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent callback(channel, method_sig, buf, None)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/connection.py", line 537, in on_inbound_method
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.channels[channel_id].dispatch_method(
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/abstract_channel.py", line 159, in dispatch_method
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent one_shot(method_sig, *args)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 160, in __call__
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self.throw()
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/vine/promises.py", line 157, in __call__
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent retval = fun(*final_args, **final_kwargs)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/amqp/channel.py", line 1812, in confirm_handler
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise MessageNacked()
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent amqp.exceptions.MessageNacked
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent During handling of the above exception, another exception occurred:
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 437, in _report_state
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent agent_status = self.state_rpc.report_state(self.context,
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/agent/rpc.py", line 105, in report_state
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return method(context, 'report_state', **kwargs)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent result = self.transport._send(
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/transport.py", line 123, in _send
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._driver.send(target, ctxt, message,
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._send(target, ctxt, message, wait_for_reply, timeout,
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 673, in _send
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent conn.topic_send(exchange_name=exchange, topic=topic,
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1531, in topic_send
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self._ensure_publishing(self._publish, exchange, msg,
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1383, in _ensure_publishing
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent self.ensure(method, retry=retry, error_callback=_error_callback)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/oslo_messaging/_drivers/impl_rabbit.py", line 1027, in ensure
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise exceptions.MessageDeliveryFailure(msg)
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent oslo_messaging.exceptions.MessageDeliveryFailure: Unable to connect to AMQP server on rabbitmq03.eqiad1.wikimediacloud.org:5671 after inf tries:
2024-11-26 11:34:08.106 2542995 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent
===== NODE GROUP =====
(1) cloudvirt1050.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:23:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:23:35.225 2423439 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:23:35Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:23:35 cloudvirt1050 neutron-openvswitch-agent[2423439]: 2024-11-26 02:23:35.226 2423439 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:23:58 cloudvirt1050 ovs-vsctl[2502996]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1048.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:12:13 cloudvirt1048 neutron-openvswitch-agent[1950]: 2024-11-26 02:12:13.544 1950 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:12:13Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:12:13 cloudvirt1048 neutron-openvswitch-agent[1950]: 2024-11-26 02:12:13.546 1950 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:12:36 cloudvirt1048 ovs-vsctl[1072522]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:12:37 cloudvirt1048 neutron-openvswitch-agent[1071347]: 2024-11-26 02:12:37.555 1071347 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-d72d2aa7-61fd-4635-97b6-97ba8abae398 - - - - - -] VIF port: 01fdbef4-ba33-498f-a927-f799897e9fea has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:42:10 cloudvirt1048 neutron-openvswitch-agent[1071347]: 2024-11-26 02:42:10.714 1071347 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:42:10Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:42:10 cloudvirt1048 neutron-openvswitch-agent[1071347]: 2024-11-26 02:42:10.715 1071347 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:42:23 cloudvirt1048 ovs-vsctl[1089163]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:42:24 cloudvirt1048 neutron-openvswitch-agent[1087982]: 2024-11-26 02:42:24.513 1087982 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-a6e045a4-7eb8-4d7e-bcf4-1692f8470315 - - - - - -] VIF port: 01fdbef4-ba33-498f-a927-f799897e9fea has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1047.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:25:46 cloudvirt1047 neutron-openvswitch-agent[2406974]: 2024-11-26 02:25:46.698 2406974 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:25:46Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:25:46 cloudvirt1047 neutron-openvswitch-agent[2406974]: 2024-11-26 02:25:46.698 2406974 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:26:08 cloudvirt1047 ovs-vsctl[2522963]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:26:10 cloudvirt1047 neutron-openvswitch-agent[2521786]: 2024-11-26 02:26:10.002 2521786 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-fe06e5e9-26cd-4c5c-b98f-6c81b8262028 - - - - - -] VIF port: 91947417-07b1-40a2-861f-9a1ec06f3042 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:26:10 cloudvirt1047 neutron-openvswitch-agent[2521786]: 2024-11-26 02:26:10.069 2521786 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-fe06e5e9-26cd-4c5c-b98f-6c81b8262028 - - - - - -] VIF port: 5d6fb6fc-fa86-4fc3-94c7-2ca45751bd3f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1046.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:12:07 cloudvirt1046 neutron-openvswitch-agent[2497746]: 2024-11-26 02:12:07.170 2497746 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:12:07Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:12:07 cloudvirt1046 neutron-openvswitch-agent[2497746]: 2024-11-26 02:12:07.170 2497746 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:12:27 cloudvirt1046 ovs-vsctl[2740020]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:41:06 cloudvirt1046 neutron-openvswitch-agent[2738857]: 2024-11-26 02:41:06.578 2738857 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:41:06Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:41:06 cloudvirt1046 neutron-openvswitch-agent[2738857]: 2024-11-26 02:41:06.579 2738857 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:41:17 cloudvirt1046 ovs-vsctl[2756346]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1066.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:09:53 cloudvirt1066 neutron-openvswitch-agent[3402432]: 2024-11-26 02:09:53.550 3402432 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:09:53Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:09:53 cloudvirt1066 neutron-openvswitch-agent[3402432]: 2024-11-26 02:09:53.551 3402432 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:10:19 cloudvirt1066 ovs-vsctl[786822]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:10:22 cloudvirt1066 neutron-openvswitch-agent[785614]: 2024-11-26 02:10:22.312 785614 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-c8f45b4e-9b45-4194-a542-2805886c896b - - - - - -] VIF port: fea0787a-8bf7-4505-a70c-f04aa2216451 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:39:23 cloudvirt1066 neutron-openvswitch-agent[785614]: 2024-11-26 02:39:23.110 785614 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:39:23Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:39:23 cloudvirt1066 neutron-openvswitch-agent[785614]: 2024-11-26 02:39:23.111 785614 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:39:44 cloudvirt1066 ovs-vsctl[803860]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:39:47 cloudvirt1066 neutron-openvswitch-agent[802668]: 2024-11-26 02:39:47.562 802668 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-ad998c55-260d-48cf-8462-88b76924a7d8 - - - - - -] VIF port: fea0787a-8bf7-4505-a70c-f04aa2216451 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1067.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:02:03 cloudvirt1067 neutron-openvswitch-agent[3255333]: 2024-11-26 02:02:03.111 3255333 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:02:03Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:02:03 cloudvirt1067 neutron-openvswitch-agent[3255333]: 2024-11-26 02:02:03.112 3255333 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:02:26 cloudvirt1067 ovs-vsctl[591801]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:02:27 cloudvirt1067 neutron-openvswitch-agent[590612]: 2024-11-26 02:02:27.962 590612 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-afcc03cf-561b-43b1-ae52-23c6fff00fc9 - - - - - -] VIF port: 99e776e3-d6fd-4c08-bad5-df76e40ebffb has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:02:27 cloudvirt1067 neutron-openvswitch-agent[590612]: 2024-11-26 02:02:27.990 590612 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-afcc03cf-561b-43b1-ae52-23c6fff00fc9 - - - - - -] VIF port: 2b8657d1-bcc0-4a85-a96f-a6568feaf1c8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:31:15 cloudvirt1067 neutron-openvswitch-agent[590612]: 2024-11-26 02:31:15.232 590612 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:31:15Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:31:15 cloudvirt1067 neutron-openvswitch-agent[590612]: 2024-11-26 02:31:15.232 590612 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:31:28 cloudvirt1067 ovs-vsctl[608638]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:31:30 cloudvirt1067 neutron-openvswitch-agent[607450]: 2024-11-26 02:31:30.359 607450 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-8fc727a6-b925-454f-84dd-c9a01056e77e - - - - - -] VIF port: 99e776e3-d6fd-4c08-bad5-df76e40ebffb has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:31:30 cloudvirt1067 neutron-openvswitch-agent[607450]: 2024-11-26 02:31:30.389 607450 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-8fc727a6-b925-454f-84dd-c9a01056e77e - - - - - -] VIF port: 2b8657d1-bcc0-4a85-a96f-a6568feaf1c8 has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1044.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:03:27 cloudvirt1044 neutron-openvswitch-agent[2687222]: 2024-11-26 02:03:27.102 2687222 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:03:27Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:03:27 cloudvirt1044 neutron-openvswitch-agent[2687222]: 2024-11-26 02:03:27.104 2687222 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:03:48 cloudvirt1044 ovs-vsctl[2797092]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:33:12 cloudvirt1044 neutron-openvswitch-agent[2795925]: 2024-11-26 02:33:12.625 2795925 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:33:12Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:33:12 cloudvirt1044 neutron-openvswitch-agent[2795925]: 2024-11-26 02:33:12.626 2795925 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:33:24 cloudvirt1044 ovs-vsctl[2813963]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
===== NODE GROUP =====
(1) cloudvirt1065.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:07:39 cloudvirt1065 neutron-openvswitch-agent[3230225]: 2024-11-26 02:07:39.808 3230225 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:07:39Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:07:39 cloudvirt1065 neutron-openvswitch-agent[3230225]: 2024-11-26 02:07:39.808 3230225 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:08:03 cloudvirt1065 ovs-vsctl[561837]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:08:05 cloudvirt1065 neutron-openvswitch-agent[560337]: 2024-11-26 02:08:05.242 560337 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-6bbeaac0-325a-406e-bd55-a542f2de4d64 - - - - - -] VIF port: 0d91f13a-9b63-4e97-9229-3226d451cd3f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
Nov 26 02:37:01 cloudvirt1065 neutron-openvswitch-agent[560337]: 2024-11-26 02:37:01.987 560337 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:37:01Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:37:01 cloudvirt1065 neutron-openvswitch-agent[560337]: 2024-11-26 02:37:01.988 560337 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:37:16 cloudvirt1065 ovs-vsctl[578730]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:37:18 cloudvirt1065 neutron-openvswitch-agent[577534]: 2024-11-26 02:37:18.130 577534 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [None req-45c2d713-2bd2-48d5-a243-2312e5e4b60d - - - - - -] VIF port: 0d91f13a-9b63-4e97-9229-3226d451cd3f has no ofport configured or is invalid, and might not be able to transmit. (ofport=-1)
===== NODE GROUP =====
(1) cloudvirt1045.eqiad.wmnet
----- OUTPUT of 'journalctl --pri...11-23 | grep ovs' -----
Nov 26 02:17:23 cloudvirt1045 neutron-openvswitch-agent[2465293]: 2024-11-26 02:17:23.496 2465293 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:17:23Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:17:23 cloudvirt1045 neutron-openvswitch-agent[2465293]: 2024-11-26 02:17:23.496 2465293 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:17:44 cloudvirt1045 ovs-vsctl[2784777]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
Nov 26 02:47:22 cloudvirt1045 neutron-openvswitch-agent[2783588]: 2024-11-26 02:47:22.558 2783588 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: 2024-11-26T02:47:22Z|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:47:22 cloudvirt1045 neutron-openvswitch-agent[2783588]: 2024-11-26 02:47:22.559 2783588 ERROR neutron.agent.common.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: None
Nov 26 02:47:35 cloudvirt1045 ovs-vsctl[2801462]: ovs|00002|db_ctl_base|ERR|multiple rows in Manager match "ptcp:6640:127.0.0.1"
================
PASS |████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 100% (37/37) [00:01<00:00, 25.45hosts/s]
FAIL | | 0% (0/37) [00:01<?, ?hosts/s]
100.0% (37/37) success ratio (>= 100.0% threshold) for command: 'journalctl --pri...11-23 | grep ovs'.
100.0% (37/37) success ratio (>= 100.0% threshold) of nodes successfully executed all commands.

My current theory is that a Puppet change was rolled out that restarted Open vSwitch across all hypervisors, causing a brief network outage that NFS then magnified:

Nov 26 02:23:34 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/neutron.conf]) Scheduling refresh of Service[neutron-openvswitch-agent]
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) --- /etc/neutron/plugins/ml2/ml2_conf.ini        2024-06-20 13:42:49.037229616 +0000
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) +++ /tmp/puppet-file20241126-2500937-at3827        2024-11-26 02:23:34.993579408 +0000
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) @@ -14,7 +14,7 @@
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)  # bugs as recent as Mitaka that cause HA gateway
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)  # failover to not update the l2pop driver causing
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)  # instances to contact the incorrect MAC
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) -mechanism_drivers = linuxbridge, openvswitch, l2population
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) +mechanism_drivers = openvswitch, l2population
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)  extension_drivers = port_security
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content)  [ml2_type_flat]
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]/content) content changed '{sha256}cba64fb7a8d2b8403a65b40fdb107106a35ccf103400a8afbe854abd76b21606' to '{sha256}8d055ebd184490b8d30acd52>
Nov 26 02:23:35 cloudvirt1050 puppet-agent[2500937]: (/Stage[main]/Openstack::Neutron::Common::Caracal/File[/etc/neutron/plugins/ml2/ml2_conf.ini]) Scheduling refresh of Service[neutron-openvswitch-agent]
Nov 26 02:23:35 cloudvirt1050 ovsdb-client[2423506]: ovs|00001|fatal_signal|WARN|terminating with signal 15 (Terminated)
Nov 26 02:23:35 cloudvirt1050 systemd[1]: Stopping neutron-openvswitch-agent.service - Openstack Neutron OpenVSwitch Agent (neutron-openvswitch-agent)...

This upstream bug never got resolved: https://bugs.launchpad.net/neutron/+bug/1868098 :/
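
To sanity-check that theory on any individual hypervisor, something like the following should line up the Puppet-triggered refresh with the agent restart. This is only a sketch: the syslog identifiers are taken from the excerpts above, and the time window is an assumption.

$ sudo journalctl -t puppet-agent -t neutron-openvswitch-agent --since '2024-11-26 02:00' --until '2024-11-26 03:00' | grep -E 'Scheduling refresh of Service|Stopping neutron-openvswitch-agent|signal 15'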

Could it also be related to the k8s side losing DNS/network connectivity?

Yes, I'm trying to investigate in that direction and connect the dots, but I don't have proof yet.

Were there signs of DNS/network failures outside of Toolforge/k8s containers? I wasn't able to find any last night while troubleshooting.

We don't have a lot of monitoring outside Toolforge, so it is hard to tell.

But toolsbeta had similar problems, and @hashar reported a spike of DNS request errors in the deployment-prep monitoring.
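
In the absence of dedicated monitoring outside Toolforge, a throwaway probe loop left running on a plain (non-Toolforge) VM would at least show whether a future hiccup is visible there too. A rough sketch, not an existing check; the target name is just an example and the interval is arbitrary:

# Log a timestamp and a DNS lookup result every 10 seconds; failures stand out in the output.
while true; do
  date -u +%FT%TZ
  dig +time=2 +tries=1 +short tools-nfs.svc.tools.eqiad1.wikimedia.cloud || echo 'DNS lookup failed'
  sleep 10
done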

aborrero claimed this task.

The outage itself has been resolved, so I'm resolving this ticket as well.

We can keep working on parent/sibling tickets.