prometheus-openstack-exporter No module named 'urlparse'
Closed, Resolved · Public

Description

As part of the port to Python 3, prometheus-openstack-exporter probably needs to switch from urlparse to urllib.parse.

In theory 2to3 should've caught this; maybe we can just run the whole library through 2to3 to see if there are other missing bits.
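
For illustration, here is a minimal sketch of the kind of change involved (the module names come from the error above; the dual-import pattern is just one common way to handle it during a port):

    # Sketch of the py2 -> py3 import fix suggested above; the try/except
    # keeps the module importable on both interpreters during the port.
    try:
        from urllib.parse import urlparse  # Python 3
    except ImportError:
        from urlparse import urlparse      # Python 2 fallback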

Related Objects

Status      Assigned
Resolved    Andrew
Resolved    Andrew
Resolved    rook
Resolved    Andrew
Resolved    aborrero
Resolved    dcaro
Resolved    aborrero
Resolved    Andrew
Resolved    Andrew
Resolved    Andrew
Resolved    taavi
Resolved    taavi
Resolved    taavi
Resolved    taavi
Resolved    taavi
Resolved    Andrew
Resolved    taavi
Resolved    Andrew
Resolved    aborrero
Resolved    aborrero
Duplicate   aborrero

Event Timeline

There are several changes required in that software. I opened an upstream issue: https://github.com/canonical/prometheus-openstack-exporter/issues/109

Let's see if they respond; otherwise we may consider doing the work ourselves.

I would recommend switching to https://github.com/openstack-exporter/openstack-exporter if possible. That codebase appears to be consumed largely as a snap via a Juju charm (https://charmhub.io/prometheus-openstack-exporter) for the managed OpenStack offering from Canonical.

Thanks. This looks more advanced than the exporter we currently have (more metrics, etc.).

Apparently there is no .deb package for it, so if we wanted to migrate, the first step would be to evaluate how we want to (and can) deploy it.

Change 771628 had a related patch set uploaded (by Majavah; author: Majavah):

[operations/debs/prometheus-openstack-exporter@master] Fixes to run on bullseye

https://gerrit.wikimedia.org/r/771628

Change 771628 merged by Arturo Borrero Gonzalez:

[operations/debs/prometheus-openstack-exporter@master] Fixes to run on bullseye

https://gerrit.wikimedia.org/r/771628

Mentioned in SAL (#wikimedia-operations) [2022-03-17T17:41:27Z] <arturo> uploaded prometheus-openstack-exporter 0.0.8-4~wmf1 to bullseye-wikimedia (T302178)

@Majavah thanks for the patch, really appreciated.

I rebuilt our internal package with it, and uploaded it to apt.wikimedia.org. However, apt.wikimedia.org already contains version 0.1.4, so 0.0.8 won't get served (the repo only serves the latest version).

Moreover, 0.1.4 is what we have in Debian upstream (https://tracker.debian.org/pkg/prometheus-openstack-exporter), which is itself pending an update to upstream 0.1.5.

I'm still undecided which way to go: either try harder with the current Debian package, or take the https://github.com/openstack-exporter/openstack-exporter route, which involves starting a packaging effort from scratch.

@aborrero Ok, thanks for the pointer about the package in Debian upstream. I updated that repository to 0.1.5 and applied the same 2to3 patch here: https://gitlab.wikimedia.org/taavi/prometheus-openstack-exporter/

I also took a look at https://github.com/openstack-exporter/openstack-exporter. It has multiple layers of dependencies that are not yet packaged in Debian.

That last one is written in Go. I tried to build it on cloudcontrol2001-dev (git clone + go build) and it worked, producing one single self-contained binary. We can use that, I guess; 15 MB of binary, but better than nothing.

Yeah, and technically we could also just build an internal .deb with the dependencies downloaded from the internet during the build process.

In T302178#7788338, @Majavah wrote:

> > That last one is written in Go. I tried to build it on cloudcontrol2001-dev (git clone + go build) and it worked, producing one single self-contained binary. We can use that, I guess; 15 MB of binary, but better than nothing.
>
> Yeah, and technically we could also just build an internal .deb with the dependencies downloaded from the internet during the build process.

That's what I meant, yes. It might not be good enough for upstream Debian, but it is enough for us: the resulting package would not depend on anything (well, maybe glibc or something).

aborrero changed the task status from Open to In Progress. Apr 6 2022, 11:35 AM
aborrero triaged this task as Low priority.
aborrero moved this task from Inbox to Doing on the cloud-services-team (Kanban) board.

The packaging is ready. Next steps are:

  • upload to our repo (reprepro likely)
  • deploy it via puppet
  • configure it via puppet
  • refresh the Grafana dashboards to use the new metrics

Change 778488 had a related patch set uploaded (by Arturo Borrero Gonzalez; author: Arturo Borrero Gonzalez):

[operations/puppet@production] aptrepo: introduce bullseye-wikimedia/component/prometheus-openstack-exporter

https://gerrit.wikimedia.org/r/778488

Change 778504 had a related patch set uploaded (by Arturo Borrero Gonzalez; author: Arturo Borrero Gonzalez):

[operations/puppet@production] prometheus-openstack-exporter: refresh profile for the new exporter

https://gerrit.wikimedia.org/r/778504

Change 778504 merged by Arturo Borrero Gonzalez:

[operations/puppet@production] prometheus-openstack-exporter: refresh profile for the new exporter

https://gerrit.wikimedia.org/r/778504

Change 778488 merged by Arturo Borrero Gonzalez:

[operations/puppet@production] aptrepo: introduce bullseye-wikimedia/component/prometheus-openstack-exporter

https://gerrit.wikimedia.org/r/778488

Change 768747 had a related patch set uploaded (by Majavah; author: Majavah):

[operations/puppet@production] P:wmcs::prometheus: use a single entry for openstack-exporter

https://gerrit.wikimedia.org/r/768747

A few things detected upon the initial deployment:

  • the request rate to the OpenStack API has increased significantly
    • this is probably because there is no caching (i.e., every exporter run fetches and regenerates all metrics)
    • also because we run the exporter on all 3 cloudcontrol nodes
  • prometheus still sees the exporter as down, because it takes a long time to answer the GET on /metrics, which may also be related to the caching issue mentioned above

Potential solutions:

  • don't run the exporter on all 3 nodes (pick one)
  • introduce some caching, and have the exporter dump the metrics to a .prom file served via the node exporter textfile collector (a sketch of this idea follows below)
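
As a rough illustration of the second option, a hypothetical caching wrapper could scrape the exporter once, write the output to a .prom file, and let the node exporter textfile collector serve it. The URL, path, and timeout below are illustrative assumptions, not our actual configuration:

    # Hypothetical caching wrapper: fetch the (slow) exporter output once and
    # publish it via the node exporter textfile collector.
    import os
    import tempfile
    import urllib.request

    EXPORTER_URL = "http://localhost:12345/metrics"          # assumed exporter port
    PROM_FILE = "/var/lib/prometheus/node.d/openstack.prom"  # assumed textfile dir

    def dump_metrics():
        metrics = urllib.request.urlopen(EXPORTER_URL, timeout=300).read()
        # Write atomically so the collector never reads a half-written file.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(PROM_FILE))
        with os.fdopen(fd, "wb") as f:
            f.write(metrics)
        os.chmod(tmp, 0o644)  # mkstemp creates 0600; node exporter must read it
        os.replace(tmp, PROM_FILE)

    if __name__ == "__main__":
        dump_metrics()  # run periodically from a systemd timer or cron job

Prometheus would then scrape the node exporter on its normal schedule, while the slow OpenStack queries happen out of band.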

Change 768747 merged by Arturo Borrero Gonzalez:

[operations/puppet@production] P:wmcs::prometheus: use a single entry for openstack-exporter

https://gerrit.wikimedia.org/r/768747

Mentioned in SAL (#wikimedia-operations) [2022-04-12T15:44:59Z] <arturo> removed a bunch of old src & binary packages for prometheus-openstack-exporter (T302178)

Mentioned in SAL (#wikimedia-operations) [2022-04-12T15:49:47Z] <arturo> aborrero@apt1001:~ $ sudo -i reprepro -C component/prometheus-openstack-exporter includedeb bullseye-wikimedia ${PWD}/prometheus-openstack-exporter_1.5.0-1_amd64.deb (T302178)

Change 779515 had a related patch set uploaded (by Arturo Borrero Gonzalez; author: Arturo Borrero Gonzalez):

[operations/puppet@production] prometheus-openstack-exporter: introduce some caching logic in the wrapper

https://gerrit.wikimedia.org/r/779515

Change 779516 had a related patch set uploaded (by Arturo Borrero Gonzalez; author: Arturo Borrero Gonzalez):

[operations/puppet@production] prometheus-openstack-exporter: only run it on the primary server

https://gerrit.wikimedia.org/r/779516

Current status:

  • the exporter was deployed on the 3 cloudcontrol servers
  • the API request rate increased significantly, but no problems were detected (meaning the new API hammering didn't bring down the service)
  • the exporter takes a long time to return the metrics and Prometheus times out scraping it, so we are currently exploring options

Current action items:

  • stop the exporter from running on all 3 cloudcontrol servers. Initial idea in patch https://gerrit.wikimedia.org/r/779516
  • explore ways to make the metrics available for scraping from the prometheus server without timeouts:
    • adding an intermediate file cache and using node-exporter (this is patch https://gerrit.wikimedia.org/r/779515)
    • having different exporters serve metrics on different ports for different openstack services (like :12345/metrics for nova, :12346/metrics for neutron, etc.). This is just speculation, untested as of this writing.
    • having prometheus run different scrape jobs, filtering information, like "https://whatever:12345/probe?cloud=eqiad1&include_services=compute". This is also just speculation and untested as of this writing; a toy illustration follows below.
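
To make that last bullet concrete, this toy snippet just exercises the speculative probe URL from the comment; neither the /probe endpoint nor its parameters are a confirmed API of the exporter:

    # Purely illustrative: query one cloud/service combination from the
    # hypothetical /probe endpoint speculated about above.
    import urllib.parse
    import urllib.request

    base = "https://whatever:12345/probe"  # placeholder host from the comment
    params = urllib.parse.urlencode({"cloud": "eqiad1", "include_services": "compute"})
    with urllib.request.urlopen(f"{base}?{params}", timeout=60) as resp:
        print(resp.status, len(resp.read()), "bytes of metrics")
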
dcaro changed the task status from In Progress to Open. Apr 14 2022, 8:31 AM
dcaro claimed this task.
dcaro added a project: User-dcaro.
dcaro moved this task from To refine to Today on the User-dcaro board.

Change 779516 merged by Andrew Bogott:

[operations/puppet@production] prometheus-openstack-exporter: only run it on the primary server

https://gerrit.wikimedia.org/r/779516

Change 788694 had a related patch set uploaded (by David Caro; author: David Caro):

[operations/puppet@production] openstack_exporter: don't use ensure twice for service

https://gerrit.wikimedia.org/r/788694

Change 788694 merged by David Caro:

[operations/puppet@production] openstack_exporter: don't use ensure twice for service

https://gerrit.wikimedia.org/r/788694

Change 791033 had a related patch set uploaded (by Majavah; author: Majavah):

[operations/puppet@production] P:wmcs::prometheus: increase openstack-exporter timeouts

https://gerrit.wikimedia.org/r/791033

Change 791033 merged by David Caro:

[operations/puppet@production] P:wmcs::prometheus: increase openstack-exporter timeouts

https://gerrit.wikimedia.org/r/791033

Change 779515 abandoned by Arturo Borrero Gonzalez:

[operations/puppet@production] prometheus-openstack-exporter: introduce some caching logic in the wrapper

Reason:

Not working on this at the moment; also not sure we need it anymore.

https://gerrit.wikimedia.org/r/779515