Edge of Network – The New Data Superhighway
A specialist surveyor is using a drone to assess the area just outside San Francisco for potential wildfire vulnerability. A flare-up would not only endanger local wildlife in a protected habitat, but also threaten multiple suburban hillside communities northwest of the bay. She needs steady hands and keen eyes for such precision work.
The twist? She’s sitting 400 miles away in her downtown Los Angeles office.
An edge network connection allows one of the best drone surveyors in the country to get eyes on the environment and make her recommendations to local fire services as they work together to mitigate the potential disaster.
Edge of network setups that stretch beyond a single metropolitan area require specific hardware and expertise. Specialized network setups close to the demarcation point, colocation in data centers, and uncongested bandwidth are all components of such a setup.
The Edge Networking Process
There are quite a few benefits to edge networking setups like this. There’s no bandwidth contention on a common ‘pipe’, no shared switches, routers, or firewalls that traffic has to pass through, and no load balancers splitting off workloads to virtual machines on a shared processor farm.
Instead, any local servers that the client wishes to run can be located right at the demarcation point, a single hop away from their branch router. This provides direct backbone access for latency-sensitive applications that need to share data across their national network. This kind of hosting is virtually on-premises, but without the need for client-built and maintained infrastructure: no server room tucked awkwardly into a corner of the office, no ‘wire closet’ stuffed to the brim with networking gear. The client doesn’t have to contend with HVAC or energy requirements, and doesn’t have to deal with security or maintenance. Their local liaison is often just one of the DevOps folks instead of a dedicated SysAdmin. Colocation in data centers comes with a lot of benefits.
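A back-of-the-envelope sketch shows why fewer hops matter. The figures below are illustrative assumptions (a rough 200 km/ms fiber propagation rule of thumb and hypothetical per-device delays), not measurements from any real network:

```python
# Rough latency-budget sketch: propagation delay plus per-device delays.
# All figures are illustrative assumptions, not vendor-measured values.

FIBER_KM_PER_MS = 200  # light in fiber covers roughly 200 km per millisecond

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over fiber for a given distance."""
    return distance_km / FIBER_KM_PER_MS

def path_latency_ms(distance_km, device_delays_ms):
    """One-way latency: propagation plus every switch/router/firewall hop."""
    return propagation_delay_ms(distance_km) + sum(device_delays_ms)

# Shared path: traffic crosses a firewall, a load balancer, and several
# shared routers (hypothetical delays in milliseconds).
shared = path_latency_ms(640, [0.5, 0.5, 0.2, 0.2, 0.2])  # ~4.8 ms one way

# Edge colocation: a single hop from the branch router onto the backbone.
edge = path_latency_ms(640, [0.2])                        # ~3.4 ms one way

print(f"shared path: {shared:.1f} ms, edge path: {edge:.1f} ms")
```

The distance term (640 km, roughly the LA-to-SF span from the opening story) is fixed by geography; what the edge setup eliminates is the stack of shared devices added on top of it.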
To provide high-speed access to other resources connected to an edge of network setup, the data center and network provider need to bring a few different skills and resources to the table.
The provider will have to find hosting facilities for the client’s hardware and servers (real or virtual) that are as close to the demarcation point of each connected facility as possible. Sometimes they’ll be lucky enough to have access to reasonable ‘middle’ points that can serve two branch offices in adjacent regions.
The colocation provider either has to own or sublet server and network space at every endpoint on the customer’s high-speed edge network. That might entail a regional effort for smaller companies, but for larger ones the hosting service might span multiple cities in a state, or cross national borders. In the future they may even need to make use of hybrid networking solutions such as LEO Internet if they wish to connect a more remote location at the maximum possible speed.
Establishing network capabilities and colocation in data centers close to each ‘spoke’ on a company’s network topology enables some rather specialized applications. For example:
Remote Surgery: Performing surgery remotely can only be done with a high-speed edge of network setup. Without the ability to see the result of a laparoscopic incision instantly, and without millisecond feedback from patient monitoring, there’s no way a doctor could perform a complex procedure safely.
Investment Banking: Bankers and traders require the lowest possible latency so that they can make massive trades at a split second’s notice. They’ve been known to spend tens of millions of dollars just to shave a few milliseconds off of their ping times to the major stock exchanges. And of course, instantly reporting on and logging every action provides a level of accountability that survives government scrutiny.
Telecommunication and Multimedia Applications: Latency is the enemy of communication. Multimedia and telecommunications companies prefer edge networking setups because any lag is effectively doubled: a conversation is bi-directional, so every exchange pays the delay once on the way out and again on the reply. And of course, larger group collaborations are facilitated by high bandwidth, low latency file and video sharing.
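The doubling effect can be made concrete with a little arithmetic. The 150 ms ceiling below follows the commonly cited ITU-T G.114 guideline for one-way voice latency; the sample delay values are illustrative assumptions:

```python
# Conversational lag: each exchange pays the one-way latency twice,
# once for the outgoing audio and once for the reply coming back.
# The 150 ms ceiling follows the commonly cited ITU-T G.114 guideline
# for one-way voice latency; the sample values below are assumptions.

ONE_WAY_TARGET_MS = 150  # widely used ceiling for natural conversation

def conversational_lag_ms(one_way_ms: float) -> float:
    """Perceived gap in a two-way exchange: the one-way latency, doubled."""
    return 2 * one_way_ms

for one_way in (20, 80, 200):
    lag = conversational_lag_ms(one_way)
    quality = "comfortable" if one_way <= ONE_WAY_TARGET_MS else "strained"
    print(f"{one_way} ms one-way -> {lag} ms perceived gap ({quality})")
```

An edge path that holds one-way latency in the tens of milliseconds keeps the perceived gap well under what a shared, multi-hop path can guarantee.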
Machine Learning and Artificial Intelligence: Being able to ingest real-time data streams is important to many ML and AI applications. Live event management is one example: producers can take the feedback of millions of Internet viewers and make on-the-fly adjustments to everything from sound volume to programming choices. Another example is national political polling, such as exit polling, where accurate predictive models can be updated at a moment’s notice.
By finding a reliable IP transit expert who provides colocation in data centers, a company can create their own edge networking setup. Avoiding shared public resources is important for additional speed, security, and data privacy. Edge of network solutions open up a door that simply isn’t there in setups that have higher latency. After experiencing edge network communications, customers often end up redefining what they consider ‘real time’ applications.