Dump instance info as a static file updated periodically
Closed, Declined (Public)

Description

The only real way to do service discovery on labs right now is to query the wikitech API, which allows us to list instances per project. This, however, is going away at some point as we try to get rid of OpenStackManager from wikitech. It also has some load concerns, since every hit queries the nova API.

I propose the following instead:

Have a cron job that runs every 5-10 minutes, generates all the info we could possibly need in a nice JSON format, and dumps it somewhere it can be served as a static file. Code would then just hit this to get the data it needs. This would run on silver. This would also allow us to do things that we can't do in code that can be called an arbitrary number of times.
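
A rough sketch of what such a dump script might look like is below. This is not the actual patch: it assumes the openstacksdk library with a configured "labs" cloud entry in clouds.yaml, and the output path, the all_projects query parameter, and the selected fields are all illustrative assumptions.

```
#!/usr/bin/env python3
"""Dump Labs instance info to a static JSON file (cron-driven sketch)."""
import json
import os
import tempfile

import openstack

# Hypothetical output location; whatever the web server ends up serving.
OUTPUT = "/srv/instance-info/instances.json"


def main():
    # Assumes a "labs" entry in clouds.yaml with observer credentials.
    conn = openstack.connect(cloud="labs")

    data = {}
    # all_projects is assumed to map to nova's all_tenants listing.
    for server in conn.compute.servers(details=True, all_projects=True):
        data.setdefault(server.project_id, []).append({
            "name": server.name,
            "status": server.status,
            "addresses": server.addresses,
        })

    # Write atomically so readers never see a half-written file.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(OUTPUT))
    with os.fdopen(fd, "w") as f:
        json.dump(data, f, indent=2, sort_keys=True)
    os.rename(tmp, OUTPUT)


if __name__ == "__main__":
    main()
```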

Use cases:

  1. Prometheus service discovery (hits wikitech API now; see the file_sd sketch after this list)
  2. Shinken service discovery (hits wikitech API now)
  3. clush service discovery (hits wikitech API now)
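
For use case 1, Prometheus could read the dump through its file-based service discovery (file_sd_configs), which expects a JSON list of {"targets": [...], "labels": {...}} groups. Below is a hedged converter sketch, assuming the dump format from the script above, a node-exporter on port 9100, and made-up URL and output paths.

```
#!/usr/bin/env python3
"""Turn the instance dump into Prometheus file_sd target groups (sketch)."""
import json
import urllib.request

# Both of these are hypothetical; substitute wherever the dump is published
# and wherever prometheus.yml's file_sd_configs points.
DUMP_URL = "https://wikitech.wikimedia.org/instances.json"
TARGETS_FILE = "/etc/prometheus/targets/labs.json"


def main():
    with urllib.request.urlopen(DUMP_URL) as resp:
        instances = json.load(resp)

    # file_sd format: a list of {"targets": [...], "labels": {...}} objects.
    groups = []
    for project, servers in instances.items():
        groups.append({
            "targets": ["%s:9100" % s["name"] for s in servers],
            "labels": {"project": project},
        })

    with open(TARGETS_FILE, "w") as f:
        json.dump(groups, f, indent=2)


if __name__ == "__main__":
    main()
```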

Event Timeline

Change 319788 had a related patch set uploaded (by Yuvipanda):
labs: Add a script that dumps instance info onto a public url

https://gerrit.wikimedia.org/r/319788

Change 319788 merged by Yuvipanda:
labs: Add a script that dumps instance info onto a public url

https://gerrit.wikimedia.org/r/319788

The script works, but is disabled right now. I need to figure out where to put the output JSON. Options are:

  1. Somewhere under wikitech.wikimedia.org
  2. Somewhere under horizon.wikimedia.org
  3. An entirely new domain just for this

Thoughts?

Under horizon is better than under wikitech in the longer term, I think, unless it can just be made into a tool?

As in, running on the tools hosts (inside labs)? That depends on T150092 + T104588, which would probably pretty much obsolete this (except in cross-project cases for performance reasons, I suppose). I agree it makes more sense to put this under horizon.wikimedia.org.

Is this moot now that we have api observer access?

Maybe it needs to be repurposed to document using the API observer access? Or is that already done somewhere that I have forgotten about? It could even go so far as to take care of the three wikitech API consumers documented in the description, but that might be better done with separate tickets.