
8 x SMF Patches between cages Eqiad - LVS & WMCS
Closed, ResolvedPublic

Description

So in total we have 8 connections that need to run from the old cage to the new cage in Eqiad.

LVS Links

4 between the "spine" switches (we're currently using lsw1-e1 and lsw1-f1 for these while waiting on the QFX5120-32Cs) and the LVS servers that are being put in place. I believe we ordered the optics and cables for all of these, but we should double-check.

| Device A | Device A Port | Device A Optic | Device B | Device B Port | Device B Optic |
| --- | --- | --- | --- | --- | --- |
| lsw1-e1 | xe-0/0/46 | 10GBase-LR | lvs1017 | ens2f0np0 (first port on pci3) | 10GBase-LR |
| lsw1-e1 | xe-0/0/47 | 10GBase-LR | lvs1018 | ens2f0np0 (first port on pci3) | 10GBase-LR |
| lsw1-f1 | xe-0/0/46 | 10GBase-LR | lvs1019 | ens2f0np0 (first port on pci3) | 10GBase-LR |
| lsw1-f1 | xe-0/0/47 | 10GBase-LR | lvs1020 | ens2f0np0 (first port on pci3) | 10GBase-LR |
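
As a rough illustration of the switch side of one of these links, here is a minimal Junos sketch for the lvs1017 port on lsw1-e1. The trunk mode and vlan names are placeholders, not the actual production config:

```
# Hypothetical sketch for lsw1-e1 (vlan names are placeholders, not production values)
set interfaces xe-0/0/46 description "lvs1017:ens2f0np0"
set interfaces xe-0/0/46 unit 0 family ethernet-switching interface-mode trunk
set interfaces xe-0/0/46 unit 0 family ethernet-switching vlan members [ private1-e1-eqiad public1-e1-eqiad ]
```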

We'll also need to move all of these to the spine switches when they are installed. So if we are ordering anything from fs.com, it's best to make sure we have the QSFP+ modules needed to run these as 4x10G breakouts (or use QSFP->SFP+ adapters). We should probably validate and discuss on the call next week.
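
For reference, channelizing a QSFP port into 4x10G on a QFX is normally a one-line chassis change; the FPC/PIC/port numbers below are placeholders and the exact syntax can vary by platform and Junos version:

```
# Hypothetical sketch: split QSFP port 4 into xe-0/0/4:0 .. xe-0/0/4:3
set chassis fpc 0 pic 0 port 4 channel-speed 10g
```

The QSFP->SFP+ adapter option avoids the channelization step but uses a full QSFP cage per 10G link.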

WMCS Links

We've decided to dedicate two racks in the new cage to WMCS, and to configure the switches there as additional "cloudsw" devices connected directly to cloudsw1-c8 and cloudsw1-d5 in the existing cage. I've opened T301414 to make a call on which racks to assign them, but either way we'll need cables run like this:

| Device A | Device A Port | Device A Optic | Device B | Device B Port | Device B Optic |
| --- | --- | --- | --- | --- | --- |
| lsw1-e4 | et-0/0/54 | 40GBase-LR | cloudsw1-d5 | et-0/0/52 | 40GBase-LR |
| lsw1-e4 | et-0/0/55 | 40GBase-LR | cloudsw1-c8 | et-0/0/52 | 40GBase-LR |
| lsw1-f4 | et-0/0/55 | 40GBase-LR | cloudsw1-c8 | et-0/0/53 | 40GBase-LR |
| lsw1-f4 | et-0/0/54 | 40GBase-LR | cloudsw1-d5 | et-0/0/53 | 40GBase-LR |
NOTE: The above switches will probably be renamed to 'cloudsw1-eX' and 'cloudsw1-fX', but I'm using the existing 'lsw' names for now to avoid confusion.
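
For illustration only, one of the cross-cage links in the table above might end up as a routed point-to-point along the lines of the sketch below; the addressing and MTU are placeholders, not the agreed design:

```
# Hypothetical sketch for lsw1-e4 <-> cloudsw1-d5 (addresses are placeholders)
set interfaces et-0/0/54 description "cloudsw1-d5:et-0/0/52"
set interfaces et-0/0/54 mtu 9192
set interfaces et-0/0/54 unit 0 family inet address 192.0.2.0/31
```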

I'm pretty sure none of the optics for these WMCS links have been purchased so we'll need to do that.

We also need to properly record the entire fiber path, with all the patch panel locations etc., and make sure it's documented correctly in Netbox (similar to the open request for details of the runs from the new cage to the CRs).
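
As a sketch of how one leg of a run could be recorded programmatically (assuming pynetbox and a NetBox 3.1/3.2-style cable payload; the URL, token, label and interfaces shown are examples only):

```python
import pynetbox

# Placeholders: URL, token, label and the interfaces used below are examples only.
nb = pynetbox.api("https://netbox.example.org", token="XXXX")

a_int = nb.dcim.interfaces.get(device="lsw1-e1", name="xe-0/0/46")
b_int = nb.dcim.interfaces.get(device="lvs1017", name="ens2f0np0")

# A direct segment is shown for brevity; in practice each run is broken into
# segments terminating on the front/rear ports of the patch panels it crosses,
# each created as its own cable object so the full path can be traced.
cable = nb.dcim.cables.create(
    termination_a_type="dcim.interface",
    termination_a_id=a_int.id,
    termination_b_type="dcim.interface",
    termination_b_id=b_int.id,
    type="smf-os2",
    status="connected",
    label="example-cable-id",
)
print(cable.id)
```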

Event Timeline

cmooney changed the task status from Open to In Progress. Feb 9 2022, 10:03 PM
cmooney changed the task status from In Progress to Open.
cmooney triaged this task as Medium priority.
cmooney created this task.

Updated task for the new cage: racks E4 and F4 will be dedicated to WMCS.

@wiki_willy @RobH
We need the following cables and optics for the old cage to finish these connections:

4x 20m SC/LC fibers
8x 40GBase-LR optics

8x 10GBase-LR optics
4x 15m SC/LC fibers

RobH mentioned this in Unknown Object (Task). Feb 17 2022, 5:05 PM

Change 765311 had a related patch set uploaded (by BBlack; author: BBlack):

[operations/puppet@production] eqiad lvs: add interfaces and IPs for rows E and F

https://gerrit.wikimedia.org/r/765311

RobH added a subtask: Unknown Object (Task). Feb 23 2022, 6:15 PM
RobH unsubscribed.

Change 765311 merged by BBlack:

[operations/puppet@production] eqiad lvs: add interfaces and IPs for rows E and F

https://gerrit.wikimedia.org/r/765311

Change 766824 had a related patch set uploaded (by BBlack; author: BBlack):

[operations/puppet@production] LVS: add new eqiad private tagged_subnets

https://gerrit.wikimedia.org/r/766824

Change 766824 merged by BBlack:

[operations/puppet@production] LVS: add new eqiad private tagged_subnets

https://gerrit.wikimedia.org/r/766824

Change 766826 had a related patch set uploaded (by BBlack; author: BBlack):

[operations/puppet@production] Eqiad LVS: remove [ef]4 vlans from config

https://gerrit.wikimedia.org/r/766826

Change 766826 merged by BBlack:

[operations/puppet@production] Eqiad LVS: remove [ef]4 vlans from config

https://gerrit.wikimedia.org/r/766826

Hi @Jclark-ctr - I know you finished running these cables on Monday, so just checking if we're good to resolve this task?

Thanks,
Willy

These runs have been completed and Netbox has been updated with all cable IDs.

Jclark-ctr closed subtask Unknown Object (Task) as Resolved. Mar 8 2022, 8:11 PM