Now that we have completed five Build Day Live events, a few patterns have emerged, along with one particular area where we have steadily improved our game. If you have watched the live streams, you will know that it is always the network that gives us the most trouble. On one Build Day, I had to use my manager's voice and declare that we were not going to troubleshoot the network any further but would instead use a field-expedient temporary configuration. The core of the problem is that we bring some networking in the vBrownBag traveling lab and need to integrate it with the vendor's lab equipment. I try to make the vBrownBag lab a little like a production environment, so there are VLANs that separate the management, storage, and workload VM networks. Most vendor environments are labs where networking is kept simple. The solution has been to expand the traveling lab network and rely less on the vendor's network. The current lab provides ten 10GbE ports and ten 1GbE ports for the vendor's equipment.
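To make that integration concrete, here is a sketch of how the VLAN separation might look on the lab's uplink, using Cisco-IOS-style syntax purely as an illustration; the VLAN numbers are hypothetical, and the actual lab switches are configured through their own management interfaces:

```text
! Hypothetical VLAN plan for the traveling lab
vlan 10
 name Management        ! host management and IPMI
vlan 20
 name Storage           ! storage traffic
vlan 30
 name Workload          ! workload VM networks
!
! The uplink to the vendor's lab switch must trunk all three VLANs
interface TenGigabitEthernet1/1
 switchport mode trunk
 switchport trunk allowed vlan 10,20,30
```

Getting the vendor's side of that link to carry the same tagged VLANs is exactly the integration step that causes most of the trouble.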
Evolution of the vBrownBag Lab Network
One of the challenges with the vBrownBag mobile lab is that it travels as my checked baggage on trans-Pacific flights. The 50 lb maximum weight is a critical constraint on the lab design, along with the need to fit inside a 4U rack travel case.
Generation 1
The first vBrownBag lab had only a single 1GbE switch, and we quickly found that network bandwidth is essential to a successful Build Day Live. Often we move workload VMs from the vBrownBag lab onto the newly deployed infrastructure. Copying a dozen VMs over 1GbE takes hours, which is not what we want in a live stream. It became clear that 10GbE was essential to the lab.
Generation 2
The first 10GbE switch in the lab was a Netgear XS708, with enough 10GbE ports for the Supermicro lab hosts, which have RJ45 10GbE ports. We also needed some 1GbE ports, so we had a small 1GbE switch connected to one of the RJ45 ports on the XS708. The Netgear also has a single SFP+ port, which allows another 10GbE device to be connected using a DAC, a short copper cable. The plan was that when we needed a 10GbE network for vendor equipment, the vendor would provide a switch and we would use a DAC cable to connect the two switches. While this worked, it meant we had no control over the vendor-provided switch or its configuration, and we learned that there is always a delay in getting that vendor switch configured. To connect the vendor's equipment directly to our switch, we needed a switch with more 10GbE SFP+ ports.
Generation 3
The next step up in switches was a Ubiquiti US-16-XG, which has four RJ45 10GbE ports plus twelve SFP+ ports. The RJ45 ports were for the Supermicro servers' 10GbE ports. A few 1GbE SFP modules provided all the 1GbE connectivity we needed, for things like the IPMI interfaces on the Supermicros and the router for Internet connectivity. That still left six unused SFP+ ports for 10GbE connectivity to the vendor's equipment. It wasn't until we needed to attach four dual-connected 10GbE hosts that we realized the SFP+ ports are too valuable to use for 1GbE.
Generation 4
The current configuration brings back the dual-switch setup. The US-16-XG provides 10GbE, with ten available SFP+ ports. One SFP+ port still carries 1GbE, but it now uplinks to a 16-port 1GbE switch. Now there are plenty of free SFP+ ports and plenty of 1GbE ports too. This configuration has not yet done duty at a Build Day Live event; no doubt we will find its limitations when it does.
Build Day Live Networking
The great thing is that we have always had the network configured correctly before the live stream. A few network issues stand out from the different events, and since none of these Build Day Live events featured network vendor equipment, the war stories do not reflect on the featured vendors' products.
The Good
The best experience we had was integrating with the HPE solutions center in Houston for the SimpliVity Build Day Live. The data center team there regularly integrates their network with diverse customer requirements; they provided a 10GbE link to our switch and trunked the required VLANs across it. Having a dedicated on-site integration engineer was great; thanks, Gutiere. The SimpliVity equipment we needed to connect used sixteen 10GbE ports and at least as many 1GbE ports, more than the vBrownBag lab can provide.
The Bad
The Datrium Build Day Live looked like it might have a real network problem: the 10GbE switch in their PAC kit (their servers, storage, and network in a box) was usually used as a simple layer-2 switch with no VLANs. It turned out that the switch was still running the factory firmware, which did not fully support 802.1q VLAN trunks with the native VLAN untagged. That was the first time we used the current vBrownBag lab network architecture: our Ubiquiti switch carried all of the 10GbE, and we added all the 1GbE ports we needed with unmanaged desktop switches taken out of the Datrium lab. As a result of that experience, Datrium now builds their PAC kits with Ubiquiti switches like the one in the vBrownBag lab.
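For readers unfamiliar with the missing feature: on an 802.1q trunk, frames for most VLANs carry a VLAN tag, while frames on the native VLAN are sent untagged, so the port must handle both behaviors at once. In IOS-style syntax (again, only an illustration, with hypothetical VLAN numbers) the requirement looks like this:

```text
! 802.1q trunk where the native VLAN is untagged
interface TenGigabitEthernet1/2
 switchport mode trunk
 switchport trunk native vlan 1           ! VLAN 1 frames leave untagged
 switchport trunk allowed vlan 1,10,20,30 ! other VLANs are tagged
```

The factory firmware could not combine tagged and untagged traffic on the same port this way, which is why the 10GbE moved onto our own switch.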
The Ugly
You may have noticed that the sound in the Pure Storage Build Day Live videos has a peculiar quality; that is the noise-reduction filter. The Arista 10GbE switch that Pure provided was designed to run in a data center: it was seriously noisy, and it sat right behind me the whole time. The good news is that Pure had a network engineer who could configure the switch for us, so the network setup did not hold up the dry run. After three days of the switch screaming in the room, it was a great relief to turn it off once the recording was done. This was the moment I realized that the vBrownBag lab needed more than eight 10GbE ports, which led to the Ubiquiti switch.
One of the aims of Build Day Live events is to show you what it would be like to be the engineer who must deploy a technology. Network problems are absolutely a routine challenge when new products are deployed into a data center, so it is no surprise that we hit network issues at Build Day Live events. Hopefully, the Generation 4 network will make it easier to accommodate new products in the vBrownBag mobile lab.