Simply defined, edge computing is a distributed, open IT architecture that features decentralized processing power, enabling mobile computing, the Internet of Things (IoT), and other low-latency technologies. With edge computing, data is processed by the device itself or by a local computer or server, rather than being transmitted to a distant data center. Edge computing enables data-stream acceleration, including real-time data processing with minimal latency. It allows smart applications and devices to respond to data almost instantaneously, as it’s being created, eliminating lag time.
Currently, data management and analysis are typically performed in the cloud or at a faraway data center. In an edge computing model, by contrast, sensors and connected devices transmit data to a nearby edge computing device, housed in a local data center, instead of transporting it back to the cloud or a remote data center.
While still a nascent technology, edge computing allows for efficient data processing: large amounts of data can be processed near the source, reducing Internet bandwidth usage. This both reduces costs and ensures that applications can be used effectively in remote locations. In addition, the ability to process data without ever putting it into a public cloud adds a useful layer of security for sensitive data.
So Many Things, So Much Data
To better comprehend the need for edge computing, one must take into account the explosive growth in IoT systems and applications expected in the coming years. According to a report by the International Data Corporation (IDC), by next year, 45 percent of IoT-created data will be stored, processed, analyzed and acted upon close to, or at, the edge of the network. Meanwhile, IHS Markit forecasts that the IoT market will grow from more than 15 billion devices three years ago to approximately 30 billion devices by 2020 and more than 75 billion in 2025.
Beyond the most frequently reported consumer use cases such as autonomous vehicles and wearables, the IoT is set to transform such industries as agriculture, construction, energy, manufacturing, healthcare, mining, public safety and utilities. Additionally, considering that many U.S. and European metro areas have launched smart city initiatives, it’s no surprise that IHS Markit predicts that global data transmissions, driven by the growth of IoT connected devices and sensors, are expected to increase from 20 to 25 percent annually to 50 percent per year, on average, over the course of the next 15 years.
With so many things producing so much data, it’s important to realize that in many instances these connected devices and sensors will only be effective at multigigabit speeds and sub-millisecond latency. Behind the wheel of an autonomous vehicle traveling 65 miles per hour on the interstate, the one-second latency we find acceptable in our favorite app could mean the difference between the car’s sensors detecting an oncoming 18-wheeler in time to swerve out of its path and a serious collision. That’s where edge computing, by placing processing and storage capabilities near the edge of the network, can meet these speed and latency requirements.
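A quick back-of-the-envelope calculation makes the stakes concrete. The sketch below (illustrative only; the function name and figures are not from the article) converts speed and latency into distance traveled before a system can react:

```python
def distance_traveled_m(speed_mph: float, latency_s: float) -> float:
    """Distance (in meters) a vehicle covers while waiting out a given latency."""
    meters_per_mile = 1609.344
    speed_mps = speed_mph * meters_per_mile / 3600  # convert mph to m/s
    return speed_mps * latency_s

# At 65 mph, a one-second round trip to a distant cloud:
print(round(distance_traveled_m(65, 1.0), 1))    # ~29.1 meters, blind
# At 65 mph, a 10-millisecond hop to a nearby edge node:
print(round(distance_traveled_m(65, 0.010), 2))  # ~0.29 meters
```

At highway speed, a full second of latency means the car travels roughly the length of a basketball court before any response; an edge-class round trip cuts that to under a third of a meter.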
In the field of healthcare, Emergency Medical Services is also undergoing a transformation thanks to the advent of edge infrastructure and IoT systems and applications. Today, ambulances are primarily used as transport to a local hospital, but due to the impact of edge computing and IoT, ambulances will soon resemble mobile emergency rooms.
The next generation of EMS practitioners will be able to conduct high-definition, two-way video dialogues with emergency room staff and physicians. This real-time relay of information will allow hospital personnel to anticipate what’s coming in from the field. Moreover, edge computing and IoT systems will enable emergency medical technicians to access patient records from databases in real time. They will also be able to transmit patients’ vital signs to hospitals while they are in transit, giving hospital staff the data they need to prepare for intake. This advance knowledge of the patient’s condition will allow hospital staff to have the right medical specialists and equipment on hand when the patient arrives at the facility.
When Milliseconds Matter
In these and other instances, edge computing reduces latency because data does not have to traverse a network to a remote data center or cloud for processing. Depending on the implementation, time-sensitive data in an edge computing architecture may be processed at the point of origin by an intelligent device or sent to an intermediary server located at a data center in close geographical proximity to the client. As described above, this is ideal for situations where latencies of milliseconds can mean the difference between life and death, but also in machine-to-machine (M2M) industrial applications. Meanwhile, data that is less time-sensitive will be sent to the cloud for historical analysis, Big Data analytics and long-term storage. Augmented Reality (AR) and Virtual Reality (VR), multiplayer gaming and eCommerce will also benefit from the advantages of edge computing and IoT systems.
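The split described above can be sketched as a simple routing decision. This is a hypothetical illustration, not an implementation from the article; the names (`Reading`, `route`) and the boolean flag are assumptions standing in for whatever policy a real deployment would use:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float
    time_sensitive: bool  # e.g., collision avoidance vs. fleet analytics

def process_at_edge(reading: Reading) -> str:
    # Handled on-device or at a nearby micro data center.
    return f"edge:{reading.sensor_id}"

def queue_for_cloud(reading: Reading) -> str:
    # Batched for historical analysis, Big Data analytics, long-term storage.
    return f"cloud:{reading.sensor_id}"

def route(reading: Reading) -> str:
    """Illustrative tiering: urgent data stays local, the rest goes upstream."""
    if reading.time_sensitive:
        return process_at_edge(reading)
    return queue_for_cloud(reading)

print(route(Reading("brake-01", 0.93, True)))      # edge:brake-01
print(route(Reading("odometer", 12345.0, False)))  # cloud:odometer
```

In practice the routing criterion would be richer (deadlines, bandwidth budgets, data sensitivity), but the two-tier shape is the same.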
In addition to increased data speed through reduced computing latency, there are three additional reasons why edge computing will become essential to enterprise business operations and IT infrastructure in the age of IoT:
Locating the Edge and the Importance of 5G
Occupying the link between connected devices and the cloud, edge computing comprises local devices such as a network appliance or server that translates cloud storage APIs; localized data centers with one to ten racks that provide significant processing and storage capabilities, including prefabricated micro data centers; and regional data centers that have more than ten racks and are located closer to the user and data source than centralized cloud data centers.
Another critical component that will facilitate edge computing is the buildout of next-gen 5G cellular networks by telecommunication companies. In the U.S., both AT&T and Verizon are conducting local trials, and Japan, South Korea, and China are already building early-stage 5G networks. As telecom providers build 5G into their wireless networks, they will increasingly add micro data centers that are either integrated into or located adjacent to 5G towers. Enterprise and cloud customers will then be able to own or lease space in these micro data centers for edge computing and gain direct access to a gateway into the telecom provider’s broader network, which could connect to a public IaaS cloud provider.
The Future of Next-Generation Communications
In a convergence that some believe represents the future of next-generation communications, tower infrastructure companies are already forging partnerships with enterprise-class data center providers to develop micro data centers at the base of communication towers that will enable edge computing. In this model, tower-based data centers bring the cloud into local areas and integrate with emerging C-RAN network architecture.
In addition to improving distribution for content providers and carriers, this edge computing architecture also creates an important, lower-cost distribution point for the cloud. With just one hop to the micro data center at the base of the tower, latency for accessing the cloud is reduced, opening the possibility of real-time applications and a richer, more immersive experience for end users.
Recently, the industry has seen several partnerships and tests between mobile network operators (MNOs) and micro data center providers for their units to be used at cell tower sites. However, those companies that can leverage the complete trifecta of cell towers, a nationwide footprint of data centers and diverse fiber assets are best positioned to offer a platform for customers seeking a highly distributed colocation and connectivity solution in support of edge computing. Moreover, to deliver a truly valuable service to the customer, it’s not just about the towers, micro data centers and dense fiber routes pushing data, storage, and processing to the edge, but about seamlessly orchestrating all of these elements to provide a fully managed end-to-end solution.
This article first appeared in Data Economy, written by Vlad Friedman, CTO at DataBank. Friedman is a seasoned IT veteran with over 25 years of mission-critical IT experience. In his role as CTO, Friedman guides direction for the development, implementation, and management of the company’s overall technology strategies.