Edge Computing and Edge Networking: An Overview
As more devices connect to the internet, the volume of data we generate grows, and so does the speed at which we need it processed. Traditional centralized computing can’t always meet these demands. This is where edge computing and edge networking come in. These decentralized models bring data processing and storage closer to the source, leading to faster response times and better utilization of network resources. This article will explore the principles and applications of edge computing and networking and how tools like Kentik Edge can assist in effectively managing them.
What is Edge Computing?
Edge computing is a distributed computing paradigm that brings computation and data storage closer to data sources. This approach minimizes latency, enhances data processing speed, optimizes network bandwidth usage, and ensures data privacy and compliance. It contrasts with traditional cloud computing, which relies heavily on a central data center to process and store data, often resulting in latency issues and potential data privacy concerns.
Edge computing operates on the core principles of proximity, autonomy, and physicality:
- Proximity: By positioning computation and data storage closer to data sources (e.g., IoT devices), edge computing minimizes the distance data needs to travel. This proximity reduces latency and allows faster, real-time data processing and decision-making.
- Autonomy: Edge computing is designed to operate independently or with minimal reliance on centralized servers. It allows data to be processed locally, even when offline, providing continuous functionality without always needing a connection to the cloud or a central data center.
- Physicality: The physical location of processing and storage resources matters in edge computing. Processing is typically performed on devices or local edge servers rather than being sent back to a central server, reducing data transport needs and potential security vulnerabilities associated with data transmission.
Edge computing harnesses the “network edge” — the area where a device or local network interfaces with the internet. By bringing data processing closer to the devices it’s communicating with, edge computing opens up new possibilities for real-time data processing and analytics.
CDNs vs. Edge Computing
Conceptually, edge computing and CDNs (content delivery networks) share some goals and benefits. Both are concerned with locating data closer to its source of consumption to improve user experience and use network resources more efficiently. However, they differ in purpose and function:
- CDNs are typically used to transmit cached data. Their primary goal is to place data-heavy or latency-sensitive content as close to the content consumer as possible.
- Edge Computing, on the other hand, implies a broader spectrum of processing, including data analysis using machine learning, processing data from IoT devices, and other complex operations at the network edge.
Edge computing can be viewed as an evolution of CDNs, offering more complex and autonomous processing capabilities at the network edge.
What is an Edge Network?
An edge network is a distributed network architecture designed to provision computing resources at the “edge” of the network. It places processing power and data storage closer to devices, reducing reliance on centralized data centers or cloud computing services. This architecture aims to improve response times and overall application performance, especially for latency-sensitive and data-intensive applications. Here’s what an edge network generally involves:
- Decentralized Processing: Instead of relying solely on a centralized data center or cloud, edge networks facilitate processing at or near the source of data generation, often using edge servers or edge devices.
- Reduced Latency: By locating processing resources closer to data sources, edge networks can significantly reduce the time it takes for data to travel, leading to lower latency.
- Enhanced Bandwidth Usage: Edge networks allow for more efficient use of network resources by reducing the need to send all data back to a central server for processing.
- Improved Data Privacy and Compliance: With edge networks, sensitive data can be processed locally without always needing to be sent across the network, providing an added layer of privacy and control.
What is the Network Edge?
In the context of edge computing, the network edge is the area where a device or local network interfaces with the internet. It is the gateway between local computing resources and the broader internet, optimizing connectivity and data processing efficiency. The network edge is marked by the following:
- Boundary of Local and Global: The network edge is the boundary or entry point where local devices or networks communicate with the global internet.
- Data Generation and Consumption: As the connection point with the internet, the network edge is often where data is generated and consumed.
- Essential for Real-time Processing: With the rise in real-time data applications, such as autonomous driving and Internet of Things (IoT) devices, the network edge’s role in quick, efficient data processing is increasingly significant.
- Evolution of Edge Devices: As technology advances, devices at the network edge are becoming more powerful, capable of processing, analyzing, and storing data locally. This trend further solidifies the role of the network edge in the edge computing paradigm.
The term “edge” in edge computing and edge networking represents this boundary or interface, emphasizing the importance of proximity in data processing. It’s the key to unlocking real-time applications and the potential of IoT, making it a focal point in the future of networking and computing.
Fog Computing vs. Edge Computing
Fog computing, often considered an extension of cloud computing, introduces an intermediary layer between the edge and the cloud. While the terms “fog computing” and “edge computing” are sometimes used interchangeably, there are key differences:
What is Fog Computing?
Fog computing creates an intermediary layer of computing resources, known as fog nodes, between the data sources at the network edge and the remote cloud. These nodes perform several tasks:
- Data Triage: When edge devices send large amounts of data to the cloud, fog nodes receive this data and pre-process it to determine what’s essential. This triage might involve aggregating, filtering, or analyzing the data to extract critical insights.
- Efficient Data Transfer: After processing the data, the fog nodes transfer the most crucial information to the cloud for storage or further analysis. Less important data may be discarded or retained at the fog nodes for additional local analysis.
- Cloud Resources Conservation: By pre-processing data and transferring only what’s essential, fog computing conserves cloud storage space. It reduces the amount of data that needs to be transferred over the network, improving efficiency and saving costs.
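To make the triage step concrete, here is a minimal Python sketch (an illustration only, with hypothetical sensor values and thresholds) of how a fog node might summarize a batch of readings and forward only the summary and out-of-range values upstream:

```python
from statistics import mean

def triage(readings, low=10.0, high=90.0):
    """Pre-process a batch of raw sensor readings at a fog node.

    Returns a compact summary plus only the anomalous readings;
    in-range raw data can stay at the fog node or be discarded.
    """
    anomalies = [r for r in readings if r < low or r > high]
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }
    return summary, anomalies  # only these travel on to the cloud

# Six raw readings collapse to a four-field summary and two anomalies.
summary, anomalies = triage([42.0, 55.5, 97.3, 61.2, 8.4, 50.0])
```

The cloud receives enough to spot trends and investigate outliers, while the bulk of the raw data never crosses the network.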
Data Processing in Fog Computing vs. Edge Computing
The critical distinction between fog computing and edge computing lies in where and how data is processed:
- Edge Computing: Edge computing focuses on bringing computation and data storage to the devices at the network edge, as close to the data source as possible. By processing data directly on edge devices or local edge servers, edge computing enables faster response times and reduces bandwidth usage.
- Fog Computing: Fog computing introduces an additional layer of processing resources between the edge and the cloud. This layer acts as a buffer, pre-processing data from edge devices before it reaches the cloud. The aim is to optimize network efficiency, reduce cloud storage needs, and enhance overall system performance.
While edge and fog computing aim to bring processing capabilities closer to the data source, they approach this task differently. Edge computing pushes computation all the way to the edge of the network, while fog computing establishes an intermediary processing layer to optimize data transfer and storage in the cloud. Choosing between these models depends on the application’s requirements, including latency sensitivity, network bandwidth, and data privacy concerns.
Why is Edge Computing Important?
Edge computing has become essential in our increasingly connected world due to its ability to process data near its source. For applications that require real-time data analysis and decisions, like autonomous vehicles, drones, and industrial IoT, latency is a significant concern that edge computing can address effectively. By minimizing the distance that data needs to travel, it reduces latency and enables real-time insights. This attribute is invaluable in numerous scenarios, such as preventing an accident in autonomous driving or making immediate financial trades based on market conditions.
The Importance of Network Observability in Edge Computing and IoT Device Telemetry
Network observability is increasingly crucial in the complex landscape of edge computing and IoT. This concept extends beyond traditional network monitoring and is concerned with understanding the state and performance of the network based on telemetry data.
Telemetry is the automatic measurement and transmission of data from various sources, including physical devices and digital abstractions such as servers, gateways, switches, and load balancers, to name a few. Within the scope of IoT, endpoint telemetry, which includes data from devices at the very edge of networks, becomes critical. Additionally, application telemetry provides context for network operators and engineers to make sense of traffic, performance, and security within their networks.
Network observability leverages this device telemetry to gain insights into several essential metrics for network health, including uptime, bandwidth and throughput, CPU and memory utilization, and interface errors. These metrics are pivotal for operational decisions regarding cost, performance, reliability, and security.
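As a simple illustration of one such metric, interface utilization can be derived from two samples of an interface's byte counters taken a known interval apart (a generic sketch, not tied to any particular vendor or platform):

```python
def interface_utilization(bytes_t0, bytes_t1, interval_s, speed_bps):
    """Percent utilization of an interface from two byte-counter samples.

    bytes_t0, bytes_t1: counter values sampled `interval_s` seconds apart.
    speed_bps: interface speed in bits per second.
    """
    bits_transferred = (bytes_t1 - bytes_t0) * 8
    return 100.0 * bits_transferred / (interval_s * speed_bps)

# A 1 Gbps link that moved 7.5 GB in 60 seconds is fully saturated.
util = interface_utilization(0, 7_500_000_000, 60, 1_000_000_000)  # 100.0
```

In practice an observability platform collects these counters continuously across every device and interface, which is what makes the centralized analysis described below necessary.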
However, even in moderately scaled businesses, the sheer volume of device telemetry data necessitates a centralized, unified data and analytics platform. This platform helps to process, store, and manage data, reduces the risk of miscommunication across distributed teams, and accelerates incident resolution. Moreover, it enables the application of AI and machine learning algorithms for automated issue detection and alerting. Network observability is integral to fully exploiting the potential of edge networking and ensuring the optimal operation of IoT networks.
For more on this topic, see our blog post, “Using Device Telemetry to Answer Questions About Your Network Health.”
How does Edge Computing Work?
Edge computing leverages various technologies and solutions to achieve its core functionality. It fundamentally relies on edge devices, servers, and data centers. Edge devices, which can range from simple sensors to complex machines, are often IoT devices but can be any data-generating device or application. Edge servers, often positioned in an edge data center, are the workhorses that process data from edge devices.
The underlying architecture of edge computing decentralizes data processing and places a higher emphasis on local computation. Instead of sending vast amounts of raw data over a network to a centralized data center, the data is first processed locally at the network edge. This preliminary processing can involve cleaning the data, extracting features, or running complex algorithms. Only the necessary — typically much smaller — amount of data is sent to the cloud or a central server for further processing, reducing latency and bandwidth usage.
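One minimal sketch of this kind of local pre-processing (with hypothetical readings and a hypothetical deadband) is report-by-exception filtering, where an edge node forwards a reading upstream only when it changes meaningfully:

```python
def report_by_exception(samples, deadband=1.0):
    """Filter a stream of edge readings, emitting only values that differ
    from the last reported value by more than `deadband`.

    This is one way an edge node can cut the volume of data sent
    upstream while preserving the shape of the signal.
    """
    reported = []
    last = None
    for s in samples:
        if last is None or abs(s - last) > deadband:
            reported.append(s)
            last = s
    return reported

# Eight raw samples collapse to three reports; only real changes travel upstream.
reports = report_by_exception([20.0, 20.2, 20.4, 25.0, 25.1, 25.2, 19.0, 19.1])
```

Real deployments layer far more sophisticated processing (feature extraction, local ML inference) on top, but the bandwidth-saving principle is the same.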
Use Cases and Applications of Edge Computing
Edge computing is transforming various industries, driving efficiency and innovation. In healthcare, for instance, it powers real-time patient monitoring and analytics, enabling quicker responses in critical situations. In manufacturing, edge computing is behind predictive maintenance, where real-time data from machines allows for timely maintenance and prevention of costly breakdowns.
Beyond these sectors, edge computing’s potential is best showcased in the rise of smart cities and autonomous vehicles. With millions of sensors and devices, cities leverage edge computing to optimize traffic, conserve energy, and enhance public safety. Similarly, autonomous vehicles rely on edge computing for immediate, real-time processing, enabling quick decisions while on the road.
Understanding the Pros and Cons of Edge Computing
Like any technology, edge computing comes with its own advantages and challenges.
Advantages
- Reducing Latency: By processing data near the source, edge computing drastically cuts down on latency, enabling real-time analytics and decision-making.
- Achieving Cost Savings: It reduces the amount of data that needs to be transported to the cloud, leading to significant savings in bandwidth and associated costs.
- Improving Reliability: By decentralizing data processing, edge computing reduces the dependency on a single central server, thereby enhancing reliability.
Challenges
- Upgrading Network Infrastructure: Implementing edge computing requires significant upgrades to existing network infrastructure, which can be complex and costly.
- Ensuring Security: With data being processed and stored across multiple nodes, securing all these points can be challenging.
- Managing Data: Handling vast amounts of data generated at the edge and deciding what should be processed locally versus what should be sent to the cloud can be complex.
Key Technologies in Edge Computing
Several technologies, such as edge analytics, edge AI, and edge caching, help enable edge computing. These allow for intelligent decision-making at the device level. Monitoring the health and performance of these distributed edge devices is essential. Solutions like Kentik’s network observability platform can provide insights into all parts of a network infrastructure’s traffic flow, performance, and health.
High-speed, low-latency 5G networks are playing a pivotal role in supporting edge computing deployments, propelling this technology to new heights. These networks significantly enhance the efficiency of device communication with the network edge, facilitating faster application response times. Combined with edge computing, they support applications requiring real-time processing and ultra-low latency, such as AR/VR, real-time gaming, and precision automation in various industries. (Our blog post, “The Impact of 5G on Enterprise Network Monitoring,” discusses why network monitoring for performance and security is crucial to the operation of hybrid enterprise networks that incorporate 5G network segments.)
Alongside 5G, “edge cloud platforms” have emerged, providing more flexibility in processing, storing, and analyzing data. Such platforms combine the advantages of edge computing and cloud computing, bringing the power of the cloud closer to the data source, thereby reducing latency and network congestion.
In this continually evolving landscape, trends and advancements consistently redefine what is possible. A prime example is the fusion of AI and edge computing, often termed edge AI. This combination gives rise to smart devices capable of sophisticated autonomous operation without continuously relying on cloud connectivity. Consider smart drones, autonomous tractors in agriculture, or medical devices that can instantly make life-saving decisions — all of these innovations are a testament to the power and potential of edge computing.
Enhancing Edge Networking with Kentik Edge
In an era of growing demand for faster, more secure, and more efficient data processing, tools like Kentik Edge have emerged as a powerful ally for businesses looking to optimize their edge computing and edge networking strategies. Kentik Edge provides a comprehensive platform for businesses to understand network utilization better and manage associated costs, especially those related to external network traffic.
Kentik Edge helps NetOps teams — including network strategists, network architects, and peering coordinators — discover edge connection opportunities, control edge costs, and plan network capacity with the following features:
Connectivity Costs
The Connectivity Costs module of Kentik Edge empowers businesses to forecast and manage their network operational costs accurately. It tracks traffic entering or leaving external interfaces, such as transit or peering. It uses provider pricing models and traffic volume measurements to predict costs for external connectivity, enabling businesses to anticipate and plan for network expenditures effectively.
Additionally, the module helps identify any discrepancies in billing statements and understand changes in operational costs, thereby preventing unexpected charges. It allows for a more strategic approach in choosing peering partners or transit providers based on the financial implications of working with them.
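For a sense of the arithmetic behind transit cost prediction, here is a generic sketch of 95th-percentile billing, the model many transit providers use; the per-Mbps rate and sample values are hypothetical and not drawn from Kentik:

```python
def estimate_monthly_cost(samples_mbps, rate_per_mbps):
    """Estimate a transit bill under 95th-percentile billing.

    Sort the 5-minute traffic samples, discard the top 5%, and bill
    the highest remaining sample at the committed per-Mbps rate.
    """
    ordered = sorted(samples_mbps)
    idx = max(int(len(ordered) * 0.95) - 1, 0)
    billable_mbps = ordered[idx]
    return billable_mbps * rate_per_mbps

# 100 samples ranging 1-100 Mbps: the top 5 are discarded, so the
# billable rate is 95 Mbps, at a hypothetical $0.50 per Mbps.
samples = list(range(1, 101))
cost = estimate_monthly_cost(samples, rate_per_mbps=0.50)  # 47.5
```

Because the top 5% of samples are free under this model, short traffic bursts don't raise the bill, which is exactly why comparing pricing models against measured traffic matters.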
Traffic Engineering
The Traffic Engineering module of Kentik Edge focuses on network efficiency and cost control. It analyzes link capacity and identifies congested interfaces, preventing network congestion and potential downtime. The module further aids in rerouting excess traffic to optimal links, ensuring a smoother, uninterrupted network performance. Its support for BGP traffic engineering allows the identification of groups of prefixes impacting network performance. Kentik Edge facilitates preemptive action, helping to avoid potential capacity crises and maintain network performance.
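A toy sketch of the underlying idea (hypothetical link names and thresholds, not Kentik's implementation): flag interfaces running hot, and list under-used links that could absorb rerouted traffic:

```python
def find_reroute_candidates(links, hot_threshold=0.80, cool_threshold=0.50):
    """Split links into congested ones and spare-capacity ones.

    links: mapping of interface name -> utilization as a fraction (0.0-1.0).
    """
    hot = [name for name, util in links.items() if util > hot_threshold]
    cool = [name for name, util in links.items() if util < cool_threshold]
    return hot, cool

links = {"transit-a": 0.92, "peer-b": 0.35, "transit-c": 0.61}
hot, cool = find_reroute_candidates(links)
# "transit-a" is congested; "peer-b" has headroom to take rerouted traffic.
```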
Peering and Interconnection
The Peering & Interconnection module of Kentik Edge optimizes network connectivity and reduces costs by identifying opportunities for direct interconnection between networks. This leads to minimized transmission and processing delays, enhancing overall network performance.
The module incorporates PeeringDB data, giving businesses contextual information about peering policies, traffic ratios, and common footprints. This information is crucial in making informed decisions about potential peering partners. Moreover, the module provides tools to measure traffic ratios effectively, enforcing peering agreements and ensuring balanced network relationships.
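As an illustration of the ratio check (hypothetical traffic numbers; a 2:1 acceptable range is assumed here, since many peering policies specify something similar):

```python
def traffic_ratio(in_bits, out_bits, max_ratio=2.0):
    """Inbound:outbound traffic ratio for a peering session, and whether
    it falls within the peer's acceptable range (assumed here as 2:1)."""
    ratio = in_bits / out_bits
    balanced = (1 / max_ratio) <= ratio <= max_ratio
    return round(ratio, 2), balanced

# 300 Gb inbound vs. 200 Gb outbound over the measurement window: 1.5:1, balanced.
ratio, ok = traffic_ratio(in_bits=300e9, out_bits=200e9)
```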
Observability for the Network Edge and Beyond
Tools like Kentik Edge offer businesses an opportunity to harness the full potential of their network resources. From mapping contract models and traffic data to track interconnection costs, to identifying potential networks for direct interconnection, to collecting diverse streaming telemetry from IoT devices, Kentik delivers a data-driven solution for optimizing network capacity, improving performance, and reducing costs. To see how Kentik can bring the benefits of network observability to your organization, start a free trial or request a personalized demo today.