We have all been there: sitting down to dinner on a Sunday night when you get an alert that one of your branches has gone offline. You politely excuse yourself from the table, open your laptop, and start troubleshooting. The problem? You can’t access any of the gear to troubleshoot the issue, and the gear you can access isn’t providing any answers as to why the site went down. Now you are faced with two choices: do you wait for Monday morning, when someone will be at the branch to help you troubleshoot, costing the company productivity and money, or do you drive out there and console into the gear yourself? If only there were a better, faster way to fix the problem.
Out-of-Band access is far from new; it has been around for as long as there have been remote networks we needed to access. The issue was that it required building a second network, parallel to the production network, to provide that access. This cost companies more than it was worth, so Out-of-Band access was relegated to the “Nice to Have” pile and mostly forgotten about, or implemented only at critical locations.
Console servers are great for getting access to your networking equipment or your power infrastructure, but if all you do is plug the console server into the same network equipment it’s designed to manage, then during an outage you cannot reach the console server either. This is why Out-of-Band access was historically so expensive: you had to build another network to support the console servers, and you had to provision another WAN circuit to truly isolate the Out-of-Band network from production.
Today, however, there are more practical solutions that won’t break the bank. With the proliferation of 4G/5G cellular networks, Smart OOB console servers, and centralized management software, the need for additional networking equipment is gone. Let’s talk about designing a modern Out-of-Band access network using tools from OpenGear.
The OpenGear Smart OOB suite of products includes the following components:
- Infrastructure Manager: This is the heart of the OOB network. These devices combine a console server, firewall, and router for both wired internet and 4G/5G cellular. An Infrastructure Manager would be placed in the MDF to serve as the console server for all equipment there (including power distribution infrastructure), plug into the internet switch for direct internet access, and/or connect to a variety of cellular networks via its 4G/5G radio.
- Remote Site Gateways/Console Servers: These are standard console servers designed for remote sites and IDFs. One would be placed in each IDF, sized according to the number of devices in that IDF. These console servers are equipped with either RJ45 copper or SFP uplink ports, which means they can connect directly back to the MDF over dedicated fiber or copper links. This feature alone negates the need for an additional management switch in each IDF.
- Lighthouse Central Manager: For deployments with five or more OpenGear products, OpenGear recommends installing its Lighthouse central manager. This “single pane of glass” lets you see all of your console servers, connect to any console port, and monitor the Out-of-Band network’s environmental sensors for things such as smoke, heat, humidity, and water leaks. Alerting via email or SMS is also centralized through the Lighthouse server.
The actual design of the Out-of-Band network places an Infrastructure Manager (IM) in the MDF, connected directly to the internet switch. The second network port on the IM is then connected to the core switch and placed in an OOB VLAN. Console Managers (CMs) are installed in each IDF and connected via their own dedicated uplinks back to the core switch, also in the OOB VLAN (seen below in red). Note that for locations with a single IDF, the CM can be connected directly to the IM. Once the console servers are in place and connected via the OOB network, the serial connections (seen below in solid blue) are made to each piece of equipment in the closet, including UPSs and PDUs.
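To make the switch-side piece of this concrete, here is a minimal sketch of what the OOB VLAN might look like on a Cisco-IOS-style core switch. The VLAN ID (999), interface numbers, and descriptions are all hypothetical; note that the OOB VLAN stays layer 2 only on the core switch, since routing for the OOB network is handled by the Infrastructure Manager.

```
! Hypothetical OOB VLAN on the core switch (layer 2 only, no SVI)
vlan 999
 name OOB-MGMT
!
! Access port facing the Infrastructure Manager's second network port
interface GigabitEthernet1/0/10
 description OOB uplink to Infrastructure Manager
 switchport mode access
 switchport access vlan 999
!
! Access port for the dedicated uplink from an IDF Console Manager
interface GigabitEthernet1/0/11
 description OOB uplink to IDF-1 Console Manager
 switchport mode access
 switchport access vlan 999
```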
Routing for the OOB network is handled by the Infrastructure Manager, and the network is reached either through the primary internet circuit or, if that circuit or the internet switch is down, through the built-in 4G LTE radio. Additionally, OpenGear has a feature called Cascade Ports, where the IM can manage the serial connections of all the other console servers connected to it. This means the Infrastructure Manager can manage all console connections across every IDF.
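Conceptually, this wired-to-cellular failover behaves like a Linux router holding two default routes with different metrics, where a link monitor withdraws the wired route when the circuit fails. The sketch below is illustrative only, not OpenGear's actual implementation, and the interface names and gateway address are hypothetical:

```
# Preferred default route via the wired internet circuit
ip route add default via 198.51.100.1 dev eth0 metric 100
# Backup default route via the cellular modem; the kernel uses the
# lowest-metric route, so this only carries traffic once a failover
# probe removes the wired route above
ip route add default dev wwan0 metric 200
```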
Now I know what you’re thinking: “If we don’t have an OOB management switch in each IDF, how are we going to get to the managed PDUs if we need to power-cycle some gear?” Well, OpenGear thought of that as well. Because the console servers are plugged into the serial ports of the PDUs, we can control the PDUs via console. Even better, OpenGear has partnered with the leading manufacturers in the power distribution space and can natively restart power outlets from its GUI, meaning you don’t need to remember the console commands.
With any device that can be accessed over the Internet, security is a big concern. As I stated above, OpenGear devices have a built-in firewall to block unwanted traffic from getting into the Out-of-Band network, and as an administrator you can define the trusted networks the firewall should allow. Additionally, OpenGear supports OpenVPN, standard IPsec VPN, and PPTP VPN, so the network can be accessed securely over a VPN. OpenVPN and PPTP are used for remote client access and for OpenGear’s Lighthouse software, while IPsec is used for point-to-point access. (Note that PPTP is widely considered insecure today and is best avoided where one of the other options will work.)
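As a rough illustration of the “trusted networks” idea, the firewall policy amounts to something like the following iptables-style rules. This is a conceptual sketch, not OpenGear’s configuration syntax, and the trusted prefix is a documentation example address:

```
# Allow replies to sessions the console server itself initiated
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# Allow SSH only from the trusted corporate prefix (example address)
iptables -A INPUT -s 203.0.113.0/24 -p tcp --dport 22 -j ACCEPT
# Drop SSH attempts from everywhere else
iptables -A INPUT -p tcp --dport 22 -j DROP
```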
In the design section I mentioned the Lighthouse central manager but never really fit it into the on-premises design. Lighthouse provides a centralized view of all OpenGear (and even some third-party) console servers across your entire organization. By leveraging the OpenVPN connections from the OpenGear console servers, Lighthouse can manage the console connections, monitor the OOB infrastructure and environmental sensors, and alert when an issue arises.
Now that the sales pitch is over, let’s talk about what Lighthouse can do with regard to zero-touch deployments. In reality, only two things need to be configured on each console server before deploying it to the field: network IP info and Lighthouse server info. From there, each console server (or, as the diagram above calls it, Enrolled Node) will call home and register with your instance of Lighthouse. Centralized policies and templates can then be deployed automatically, eliminating the need to configure every device by hand.
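To show just how small that pre-staging footprint is, here is a short Python sketch of the minimal per-device data a staging checklist might validate before a console server ships. This is my own illustration, not OpenGear’s configuration format; the field names (including `enrollment_token`) and addresses are hypothetical:

```python
# Hypothetical minimal pre-staging data for one console server:
# just its network info and where to find Lighthouse.
bootstrap = {
    "network": {"ip": "192.0.2.10", "netmask": "255.255.255.0",
                "gateway": "192.0.2.1"},
    "lighthouse": {"address": "lighthouse.example.com",
                   "enrollment_token": "<token>"},
}

def validate_bootstrap(cfg):
    """Return a list of missing fields; empty means ready to ship."""
    required = {
        "network": ["ip", "netmask", "gateway"],
        "lighthouse": ["address", "enrollment_token"],
    }
    return [f"{section}.{key}"
            for section, keys in required.items()
            for key in keys
            if key not in cfg.get(section, {})]

print(validate_bootstrap(bootstrap))  # → []
```

Everything else, such as port labels, user accounts, and alerting, can come down later as centralized policies and templates once the node enrolls.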
Hopefully you can see how a Smart OOB network can maintain access to devices even when the production network is down, and how, by eliminating the need for additional management switches in each IDF, an Out-of-Band network becomes worth the expense. Out-of-Band access should always be considered when planning a new site build or upgrading an existing site.