Reduces dependence on DNS!
Description
Details
Subject | Repo | Branch | Lines +/-
---|---|---|---
Point dynamicproxy to IPs instead of hostnames | operations/puppet | production | +2 -3
Status | Subtype | Assigned | Task
---|---|---|---
Resolved | | chasemp | T136073 novaproxy 502's due to intermittent DNS failures
Resolved | | bd808 | T133554 Switch dynamicproxy to point back to IP rather than domain names
Event Timeline
Change 285288 had a related patch set uploaded (by Alex Monk):
Point dynamicproxy to IPs instead of hostnames
Patch above will point new backends to IPs instead of hostnames. Do you also want me to write a script to update all existing records @yuvipanda?
Instead of doing something custom we could probably do a restart of webservices causing a re-register
https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Admin#Restarting_all_webservices
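A mass restart along those lines could be sketched as follows; this is a hypothetical dry-run helper, and the exact `sudo -i -u tools.<name> webservice restart` invocation and tool names are assumptions, not taken verbatim from the linked page:

```python
#!/usr/bin/env python
# Sketch: build (and optionally run) a "webservice restart" command for each
# tool, forcing every webservice to re-register its backend with the proxy.
# The sudo/webservice command shape is an assumption for illustration.
import subprocess


def restart_commands(tools, dry_run=True):
    """Return the restart command for each tool; execute them when dry_run=False."""
    cmds = []
    for tool in tools:
        cmd = ['sudo', '-i', '-u', 'tools.' + tool, 'webservice', 'restart']
        cmds.append(cmd)
        if not dry_run:
            subprocess.check_call(cmd)
    return cmds


for cmd in restart_commands(['admin', 'example-tool']):
    print(' '.join(cmd))
```

With `dry_run=True` (the default) nothing is executed, which makes it easy to eyeball the command list before actually restarting anything.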
@yuvipanda, @chasemp: How's this? It should cover all the existing weird data apart from non-existent hosts which are T132231: Cleanup proxies that point to nonexistent instances
```python
import socket
import sqlite3
import urlparse

conn = sqlite3.connect('dynamicproxy-api-data.db')
c = conn.cursor()


def change_netloc(netloc):
    """Replace the hostname in a host[:port] netloc with its resolved IP."""
    if ':' in netloc:
        host_part, port = netloc.split(':')
    else:
        host_part = netloc
        port = "80"
    try:
        new_host = socket.gethostbyname(host_part)
    except socket.error:
        # Leave unresolvable hosts untouched; just report them.
        print(host_part)
        return netloc
    return new_host + ":" + port


c.execute('select id, url from backend')
for id, host in c.fetchall():
    p = urlparse.urlparse(host)
    if p.scheme != '':
        new_url = p.scheme + '://' + change_netloc(p.netloc)
        if p.path != '':
            new_url += p.path
    elif p.netloc != '':
        new_url = change_netloc(p.netloc)
        if p.path != '':
            new_url += p.path
    else:
        # Bare "host" or "host:port" URLs parse as a path with no netloc.
        new_url = change_netloc(p.path)
    if new_url != host:
        c.execute('update backend set url = :new_url where id = :id',
                  {'new_url': new_url, 'id': id})

conn.commit()
conn.close()
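For illustration, the host-to-IP substitution at the core of that script can be exercised on its own; this is a standalone copy of the `change_netloc` helper, using `localhost` as a resolvable example and a `.invalid` name for the failure path:

```python
import socket


def change_netloc(netloc):
    """Replace the hostname in a host[:port] netloc with its resolved IPv4 address."""
    if ':' in netloc:
        host_part, port = netloc.split(':')
    else:
        host_part, port = netloc, '80'
    try:
        new_host = socket.gethostbyname(host_part)
    except socket.error:
        # Unresolvable hosts (see T132231) are left untouched.
        return netloc
    return new_host + ':' + port


print(change_netloc('localhost:8080'))  # -> 127.0.0.1:8080
print(change_netloc('no-such-host.invalid'))  # unchanged: no-such-host.invalid
```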
Change 285288 merged by Andrew Bogott:
Point dynamicproxy to IPs instead of hostnames
I think we should run https://phabricator.wikimedia.org/T133554#2243928 and then do the same in redis as well.
Mentioned in SAL (#wikimedia-cloud) [2019-03-12T03:21:41Z] <bd808> Removed redis sets with no record in the backing database (T133554)
I purged all of the redis sets with no matching record in the database to close this out.
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2019 Bryan Davis and Wikimedia Foundation. All Rights Reserved.
from __future__ import print_function

import argparse

import redis
import sqlite3


def main():
    parser = argparse.ArgumentParser(description='Fix orphan redis sets')
    parser.add_argument(
        '--do-it', dest='do_it', action='store_true',
        help='Update database and redis')
    args = parser.parse_args()

    conn = sqlite3.connect('/etc/dynamicproxy-api/data.db')
    cur = conn.cursor()
    r = redis.Redis()

    cur.execute("""
        SELECT r.domain as domain
        FROM project p
        JOIN route r on p.id = r.project_id
        JOIN backend b on r.id = b.route_id
    """)
    domains = [domain for domain, in cur.fetchall()]
    conn.close()

    # Any "frontend:<domain>" key without a matching database row is an orphan.
    orphans = [
        key for key in r.scan_iter(match='frontend:*')
        if key[9:] not in domains
    ]
    for orphan in orphans:
        print(orphan)
    if args.do_it and orphans:
        r.delete(*orphans)


main()
```