Decentralized IT architecture
Compared with traditional, centralized computing, edge computing offers businesses and other organizations a faster, more efficient way to process data with enterprise-grade applications. In the past, edge points generated massive amounts of data that often went unused. Now that IT architecture can be decentralized through mobile computing and the Internet of Things (IoT), companies can gain near-real-time insights with lower latency and reduced cloud server bandwidth demands, all while adding a layer of security for sensitive data.
The next evolution of cloud computing
In many ways, edge computing is the next evolution of cloud computing, accelerated by the rise of 5G networks around the world. More companies than ever can now harness comprehensive data analysis without the IT infrastructure required in previous generations. Edge computing also has many possible applications, including security and medical monitoring, self-driving vehicles, video conferencing, and enhanced customer experiences.
The evolution of edge computing
The origins of edge computing lie in the 1990s with the creation of the first content delivery network (CDN), which served content from nodes placed closer to end users. But this technology was limited to images and videos, not massive data workloads. In the 2000s, the growing shift to mobile and early smart devices increased the strain on existing IT infrastructure. Approaches such as pervasive computing and peer-to-peer overlay networks sought to alleviate some of that strain.
However, it wasn’t until the mainstream application of cloud computing that true decentralization of IT began, giving end users enterprise-level processing power with increased flexibility, on-demand scalability, and collaboration from anywhere in the world.
Yet, with more end users demanding cloud-based applications and more businesses operating from multiple locations, it became necessary to process more data outside the data center, right at the source, while managing it from one central location. That is when mobile edge computing became a reality.
Definition of edge computing
One definition of edge computing is any type of computer program that delivers low latency closer to where requests originate. Karim Arabi, in an IEEE DAC 2014 keynote and subsequently in an invited talk at MIT’s MTL Seminar in 2015, defined edge computing broadly as all computing outside the cloud, happening at the edge of the network, and more specifically in applications where real-time processing of data is required. In his definition, cloud computing operates on big data while edge computing operates on “instant data”: real-time data generated by sensors or users.
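The big-data/instant-data split above can be sketched in code: an edge node reacts to raw sensor readings in real time and forwards only a compact summary to the cloud. The following is a minimal illustration, not a reference implementation; all names and values here are hypothetical.

```python
from statistics import mean

def process_at_edge(sensor_readings, threshold):
    """Handle 'instant data' locally at the edge, forwarding only
    an aggregate upstream to the cloud."""
    # Real-time path: flag anomalous readings immediately,
    # without a round trip to a distant data center.
    alerts = [r for r in sensor_readings if r > threshold]
    # Batch path: the cloud receives a summary, not every raw reading,
    # which is where the bandwidth savings come from.
    summary = {"count": len(sensor_readings), "mean": mean(sensor_readings)}
    return alerts, summary

alerts, summary = process_at_edge([18.2, 19.1, 35.7, 18.9], threshold=30.0)
print(alerts)   # readings needing an immediate local response: [35.7]
print(summary)  # compact payload sent to the cloud
```

The point of the sketch is the asymmetry: every reading is inspected at the edge, but only two numbers cross the network.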
The term is often used synonymously with fog computing.
According to The State of the Edge report, edge computing concentrates on servers “in proximity to the last mile network.” Alex Reznik, Chair of the ETSI MEC ISG standards committee, loosely defines the term: “anything that’s not a traditional data center could be the ‘edge’ to somebody.”
Edge nodes used for game streaming are known as gamelets, which are usually one or two hops away from the client. In the cloud gaming context, Anand and Edwin say “the edge node is mostly one or two hops away from the mobile client to meet the response time constraints for real-time games.”
Edge computing may employ virtualization technology to make it easier to deploy and run a wide range of applications on edge servers.
In a similar way, the aim of edge computing is to move computation away from data centers towards the edge of the network, exploiting smart objects, mobile phones, or network gateways to perform tasks and provide services on behalf of the cloud. By moving services to the edge, it is possible to provide content caching, service delivery, persistent data storage, and IoT management, resulting in better response times and transfer rates. At the same time, distributing the logic across different network nodes introduces new issues and challenges.
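Content caching, one of the edge services named above, can be sketched with a toy least-recently-used (LRU) cache: popular content is served locally at the edge node, and the origin (the cloud) is contacted only on a miss. This is a simplified sketch, not how any particular CDN works; the class and function names are invented for illustration.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache for an edge node: serve popular content locally,
    fall back to the origin (cloud) only on a cache miss."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.origin_fetches = 0  # round trips to the cloud

    def get(self, key, fetch_from_origin):
        if key in self.store:
            self.store.move_to_end(key)    # mark as recently used
            return self.store[key]         # hit: no trip to the cloud
        self.origin_fetches += 1
        value = fetch_from_origin(key)     # miss: one trip to the origin
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False) # evict least recently used
        return value

cache = EdgeCache(capacity=2)
origin = lambda key: f"content-for-{key}"  # stand-in for a cloud fetch
for key in ["a", "b", "a", "a"]:
    cache.get(key, origin)
print(cache.origin_fetches)  # 2: "a" and "b" fetched once; repeats served at the edge
```

Four requests cost only two origin fetches, which is the response-time and transfer-rate benefit the paragraph above describes; the trade-off is the new coordination problem of keeping many such node-local caches consistent.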