Edge Computing: Explained!
Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. Data is the lifeblood of modern enterprises, providing valuable business insight and supporting real-time control of critical business processes and operations. Today’s businesses are awash in data. Sensors and IoT devices operating in real time from remote locations and inhospitable operating environments can collect large amounts of data almost anywhere in the world. This flood of data, however, is also changing the way businesses handle computing.
Traditional computing paradigms built on centralized data centers and the everyday internet are not well suited to moving endlessly growing rivers of real-world data. Bandwidth limitations, latency issues, and unpredictable network disruptions can all conspire to impair such efforts. Businesses are responding to these data challenges with edge computing architectures.
In simplest terms, edge computing moves some portion of the storage and compute resources out of the central data center and closer to the source of the data itself. Rather than transmitting raw data to a central data center for processing and analysis, that work is instead performed where the data is generated—whether that’s a retail store, a factory floor, a sprawling utility, or across a smart city. The result of that computing work at the edge, such as real-time business insights, equipment maintenance predictions, or other actionable answers, is sent back to the main data center for review and other human interactions.
Edge computing is reshaping IT and business computing. Here is a comprehensive look at what edge computing is, how it works, and the benefits it offers.
How Does Edge Computing Work?
Edge computing is all a matter of location. In traditional enterprise computing, data is produced at a client endpoint, such as a user’s computer. That data moves across the corporate LAN and over a WAN such as the internet, where it is stored and processed by corporate applications. The results of that work are then sent back to the client endpoint. This remains a tried-and-true approach to client-server computing for most common business applications.
However, the number of devices connected to the internet, and the volume of data those devices generate and enterprises consume, is growing far too quickly for traditional data center infrastructures to accommodate. Gartner predicts that by 2025, 75% of enterprise-generated data will be created outside of centralized data centers. The prospect of moving so much data in situations that can often be time- or disruption-sensitive puts incredible strain on the global internet, which itself is often subject to congestion and disruption.
So, IT architects have shifted focus from the central data center to the logical edge of the infrastructure, taking storage and computing resources from the data center and moving those resources to the point where the data is generated. The principle is straightforward: If you can’t get the data closer to the data center, get the data center closer to the data. The concept of edge computing isn’t new, and it is rooted in decades-old ideas of remote computing, such as remote offices and branch offices where it was more reliable and efficient to place computing resources at the desired location rather than rely on a single central location.
Edge computing puts storage and servers where the data is, often requiring little more than a partial rack of gear to operate on the remote LAN to collect and process the data locally. In many cases, the computing gear is deployed in shielded or hardened enclosures to protect the gear from extremes of temperature, moisture, and other environmental conditions. Processing often involves normalizing and analyzing the data stream to look for business intelligence, and only the results of the analysis are sent back to the principal data center.
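The idea of normalizing and analyzing the stream locally and shipping back only the results can be sketched in a few lines. The following is a minimal, hypothetical example (the millivolt-to-Celsius conversion and all figures are illustrative assumptions, not a real device's scale): an edge node processes a full hour of raw sensor readings on the local LAN and forwards only a compact summary.

```python
import statistics

# Hypothetical sketch: an edge node normalizes raw sensor readings locally
# and forwards only a compact summary upstream, never the raw stream.

def normalize(raw_reading_mv: float) -> float:
    """Convert a raw millivolt reading to degrees Celsius (assumed linear scale)."""
    return (raw_reading_mv - 500.0) / 10.0

def summarize(raw_readings_mv: list[float]) -> dict:
    """Process the full stream at the edge; only this summary leaves the site."""
    temps = [normalize(r) for r in raw_readings_mv]
    return {
        "count": len(temps),
        "mean_c": round(statistics.mean(temps), 2),
        "max_c": max(temps),
        "min_c": min(temps),
    }

# One hour of one-second samples stays on the local LAN...
readings = [500.0 + (i % 60) for i in range(3600)]
summary = summarize(readings)
# ...and only a few dozen bytes travel back to the central data center.
print(summary)
```

The design point is the asymmetry: thousands of raw samples are consumed locally, while the WAN carries only the four summary fields.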
The idea of business intelligence can vary dramatically. Some examples include retail environments where video surveillance of the showroom floor might be combined with actual sales data to determine the most desirable product configuration or consumer demand. Other examples involve predictive analytics that can guide equipment maintenance and repair before actual defects or failures occur. Yet other examples are often associated with utilities such as water treatment and power generation to ensure that the equipment is functioning properly and the quality of the output is maintained.
Why Is Edge Computing So Important?
Computing tasks require good architecture, and an architecture that fits one type of computing task does not necessarily fit all types of computing tasks. Edge computing has emerged as a viable and critical architecture that supports distributed computing, ideally providing computing and storage resources close to the same physical location as the data source. In general, distributed computing models aren’t new, and the concepts of remote offices, branch offices, data center colocation, and cloud computing have long proved their value.
However, decentralization can be difficult, demanding a level of monitoring and control that is easily overlooked when moving away from a traditional centralized computing model. Edge computing has become relevant because it offers an effective solution to emerging network problems associated with moving the enormous volumes of data that today’s businesses produce and consume. It’s not just a problem of quantity; it’s also a matter of time. Applications depend on processing and responses that are increasingly time-sensitive.
- Bandwidth. Bandwidth is the amount of data a network can carry over time, usually expressed in bits per second. All networks have limited bandwidth, and wireless communication is even more constrained. This means there is a finite limit to the amount of data, or the number of devices, that can communicate across the network. It is possible to increase network bandwidth to accommodate more devices and data, but the cost can be significant, the limits remain (only higher), and other problems go unsolved.
- Latency. Latency is the time it takes to send data between two points on a network. Although communication ideally takes place at the speed of light, large physical distances, coupled with network congestion or outages, can delay data’s movement across the network. This delays any analytics and decision-making processes and reduces a system’s ability to respond in real time. In the case of self-driving cars, those delays can even cost human lives.
- Congestion. The internet is basically a global “network of networks.” Although it has evolved to offer good general-purpose data exchange for most everyday computing tasks, such as file sharing and basic streaming, the volume of data involved with tens of billions of devices can overwhelm it, causing high levels of congestion and forcing time-consuming data retransmissions. In other cases, network outages can exacerbate congestion and even cut off some internet users entirely, making the internet of things useless during outages.
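The bandwidth point above can be made concrete with a back-of-envelope calculation. All the figures below (camera count, stream bitrate, summary size) are illustrative assumptions, not measurements; the sketch simply compares the sustained WAN capacity needed to backhaul raw video against sending only edge-processed analytics summaries.

```python
# Back-of-envelope sketch (all figures are illustrative assumptions):
# WAN bandwidth to backhaul raw data from one site, versus sending
# only the results of edge-side analysis.

CAMERAS = 50                   # assumed video cameras at one site
RAW_MBPS_PER_CAMERA = 4.0      # assumed bitrate of one 1080p stream, in Mbps
SUMMARY_KB_PER_MIN = 2.0       # assumed size of one per-camera summary, per minute

raw_wan_mbps = CAMERAS * RAW_MBPS_PER_CAMERA
# KB/min -> megabits/second: x8 bits per byte, /1000 KB per Mb-ish unit, /60 seconds
summary_wan_mbps = CAMERAS * SUMMARY_KB_PER_MIN * 8 / 1000 / 60

print(f"Raw backhaul:     {raw_wan_mbps:.1f} Mbps sustained")
print(f"Edge summaries:   {summary_wan_mbps:.4f} Mbps sustained")
print(f"Reduction factor: {raw_wan_mbps / summary_wan_mbps:,.0f}x")
```

Under these assumed numbers, processing at the edge cuts the sustained WAN load by roughly four orders of magnitude, which is the core of the bandwidth argument.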
What Are the Benefits of Edge Computing?
Edge computing helps minimize bandwidth use and server resources. Bandwidth and cloud resources are finite and cost money. Statista predicts that by 2025, more than 75 billion IoT devices will be installed worldwide, as homes and offices become equipped with smart cameras, printers, thermostats, and even toasters. To support all those devices, significant amounts of computation will have to move to the edge.
Another significant benefit of moving processes to the edge is reduced latency. Every time a device needs to communicate with a distant server somewhere, that creates a delay. For example, two coworkers in the same office chatting over an IM platform might experience a sizable delay because each message has to be routed out of the building, communicate with a server somewhere across the globe, and be brought back before it appears on the recipient’s screen. If that process is brought to the edge, and the company’s internal router is made responsible for transferring intra-office chats, that noticeable delay does not exist. The length of such delays depends on available bandwidth and server location, but these delays can be avoided altogether by bringing more processes to the network edge.
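The chat example comes down to propagation delay over physical distance. The sketch below estimates the best-case round-trip time through a distant cloud region versus an in-building edge router; the distances and the fiber propagation speed are rough assumptions, and real latency would add queuing and processing time on top.

```python
# Illustrative sketch (assumed figures): propagation delay alone for a
# message routed through a distant cloud region vs. a local edge router.

SPEED_IN_FIBER_KM_S = 200_000  # light in fiber travels at roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay, ignoring queuing and processing."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

cloud_rtt = round_trip_ms(8000)  # assumed distance to a remote cloud region
edge_rtt = round_trip_ms(0.1)    # assumed distance to an in-building router

print(f"Via remote cloud: {cloud_rtt:.1f} ms minimum")
print(f"Via local edge:   {edge_rtt:.4f} ms minimum")
```

Even before congestion is considered, physics alone imposes tens of milliseconds on the long route, while the edge route is effectively instantaneous.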
In addition, edge computing can provide new functionality that wasn’t previously available. For example, a company can use edge computing to process and analyze its data at the edge, which makes it possible to do so in real time.
In summary, the main benefits of edge computing are:
- Latency reduction
- Reduced bandwidth usage and associated costs
- Reduced server resources and related costs
- Added functionality
Better Things Start at the Edge
Edge computing provides businesses and service providers with unprecedented opportunities to unleash the value of their data. With the right partners, companies can always get the most out of their data. With tens of thousands of edge implementations, hundreds of market-ready solutions, standards-based technologies, and the world’s most mature developer ecosystem that delivers true value, you can maximize your business potential. Are you ready to learn more?