
Switch dynamicproxy to point back to IPs rather than domain names
Closed, Resolved, Public

Description

Reduces dependence on DNS!

Event Timeline

(Alex has graciously offered to take this on!)

Change 285288 had a related patch set uploaded (by Alex Monk):
Point dynamicproxy to IPs instead of hostnames

https://gerrit.wikimedia.org/r/285288

Patch above will point new backends to IPs instead of hostnames. Do you also want me to write a script to update all existing records @yuvipanda?
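For context, the change amounts to resolving the instance hostname once at registration time and building the backend URL from the resulting IP, so later DNS outages don't affect routing. A minimal sketch (the helper name and port default are hypothetical, not the actual patch):

import socket

def backend_url(hostname, port=80):
    # Resolve once when the backend registers; the proxy then stores the IP
    # rather than the hostname.
    ip = socket.gethostbyname(hostname)
    return 'http://%s:%d' % (ip, port)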

Instead of doing something custom, we could probably just restart all webservices, which would cause them to re-register:

https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Admin#Restarting_all_webservices
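Roughly, the mass-restart approach on that page is run once per tool so each webservice re-registers its route with the proxy. A hedged sketch of the loop (the tool list and sudo pattern are assumptions, not taken from the page):

import subprocess

tools = ['example-tool-a', 'example-tool-b']  # assumed; really enumerated from the tools project
for tool in tools:
    # `webservice restart` restarts the tool's webservice, which re-registers
    # its proxy route.
    subprocess.call(['sudo', '-i', '-u', 'tools.' + tool, 'webservice', 'restart'])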

I don't really know what that is but it looks tools-specific

ah, yes I was only thinking about changing webservices in tools en masse :)

@yuvipanda, @chasemp: How's this? It should cover all the existing weird data apart from non-existent hosts, which are T132231: Cleanup proxies that point to nonexistent instances

import socket
import sqlite3
import urlparse

conn = sqlite3.connect('dynamicproxy-api-data.db')
c = conn.cursor()

def change_netloc(netloc):
    # Split off an explicit port if present, defaulting to 80.
    if ':' in netloc:
        host_part, port = netloc.split(':')
    else:
        host_part = netloc
        port = "80"
    try:
        new_host = socket.gethostbyname(host_part)
    except socket.gaierror:
        # Host no longer resolves (see T132231); leave the record as-is.
        print(host_part)
        return netloc
    return new_host + ":" + port

c.execute('select id, url from backend')
for id, host in c.fetchall():
    p = urlparse.urlparse(host)
    if p.scheme != '':
        # Full URL, e.g. http://hostname:port/path
        new_url = p.scheme + '://' + change_netloc(p.netloc)
        if p.path != '':
            new_url += p.path
    elif p.netloc != '':
        # Scheme-relative URL, e.g. //hostname:port/path
        new_url = change_netloc(p.netloc)
        if p.path != '':
            new_url += p.path
    else:
        # Bare hostname (no scheme); urlparse leaves it in .path
        new_url = change_netloc(p.path)
    if new_url != host:
        c.execute('update backend set url = :new_url where id = :id',
                  {'new_url': new_url, 'id': id})
conn.commit()

Change 285288 merged by Andrew Bogott:
Point dynamicproxy to IPs instead of hostnames

https://gerrit.wikimedia.org/r/285288

I just need to make it carry out the same change to the data in redis. Sound good, @yuvipanda?
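For illustration, that redis pass could look roughly like the sketch below, assuming each frontend:<domain> key holds a set of backend URLs (as the later comments on this task call them "redis sets"); this is an assumption-laden sketch, not the script that was actually run:

import socket
import urlparse

import redis

r = redis.Redis()
for key in r.scan_iter(match='frontend:*'):
    for member in r.smembers(key):
        p = urlparse.urlparse(member)
        if not p.netloc:
            continue  # assumes members are full URLs like http://host:port
        host = p.netloc.split(':')[0]
        try:
            ip = socket.gethostbyname(host)
        except socket.gaierror:
            continue  # non-resolving hosts are left for T132231
        new_member = member.replace(host, ip, 1)
        if new_member != member:
            # Swap the hostname-based member for the IP-based one.
            r.srem(key, member)
            r.sadd(key, new_member)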

Krenair subscribed.

Mentioned in SAL (#wikimedia-cloud) [2019-03-12T03:21:41Z] <bd808> Removed redis sets with no record in the backing database (T133554)

bd808 claimed this task.
bd808 subscribed.

I purged all of the redis sets with no matching record in the database to close this out.

fix-redis.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2019 Bryan Davis and Wikimedia Foundation. All Rights Reserved.
from __future__ import print_function

import argparse

import redis
import sqlite3


def main():
    parser = argparse.ArgumentParser(description='Fix orphan redis sets')
    parser.add_argument(
        '--do-it', dest='do_it', action='store_true',
        help='Update database and redis')
    args = parser.parse_args()

    conn = sqlite3.connect('/etc/dynamicproxy-api/data.db')
    cur = conn.cursor()
    r = redis.Redis()

    cur.execute("""
    SELECT
      r.domain as domain
    FROM
      project p
      JOIN route r on p.id = r.project_id
      JOIN backend b on r.id = b.route_id
    """)
    domains = [domain for domain, in cur.fetchall()]
    conn.close()

    orphans = [
        key for key in r.scan_iter(match='frontend:*')
        if key[9:] not in domains
    ]
    for orphan in orphans:
        print(orphan)
    if args.do_it and orphans:
        r.delete(*orphans)

main()
NOTE: I messed up the first run of this and managed to purge all of the keys from redis by omitting the trim of frontend: from the if clause of the orphans list comprehension. It was easy to recover from this with sudo service uwsgi-invisible-unicorn restart, which reloads everything from the database.
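In other words, the failure mode was comparing the full redis key (including the frontend: prefix) against the bare domains, so nothing ever matched and every key was treated as an orphan:

# Buggy first run: 'frontend:example.org' never equals 'example.org',
# so every key looked like an orphan and was deleted.
orphans = [key for key in r.scan_iter(match='frontend:*') if key not in domains]

# Fixed version: strip the 9-character 'frontend:' prefix before comparing.
orphans = [key for key in r.scan_iter(match='frontend:*') if key[9:] not in domains]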