What Does Edge Computing Mean?
Edge computing is a distributed network architecture that processes data as close to its source as possible in order to minimize bandwidth consumption and network latency. A key goal of edge computing is to shorten communication time between clients and servers. In some cases, data is processed on the originating device itself, and only the most important results are ever transferred off the device.
In addition to facilitating real-time data processing, the benefits of edge computing include:
- Improved response time – data doesn’t have to travel to and from a remote data center for processing.
- Bandwidth optimization – only the most important data needs to be transferred over the network.
- Security optimization – the security risk footprint is reduced because less unencrypted data is sent over the network.
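The bandwidth benefit above can be illustrated with a minimal sketch, assuming a hypothetical temperature-sensing device: readings are processed locally, and only those that cross an alert threshold (a made-up value here) are forwarded over the network.

```python
# Hypothetical edge-side filtering: process readings on the device and
# forward only the ones that matter, instead of streaming everything.

ALERT_THRESHOLD_C = 75.0  # assumed alert threshold for this sketch

def filter_readings(readings):
    """Keep only the readings worth sending upstream."""
    return [r for r in readings if r["temp_c"] >= ALERT_THRESHOLD_C]

raw = [
    {"sensor": "s1", "temp_c": 21.4},
    {"sensor": "s2", "temp_c": 78.9},
    {"sensor": "s3", "temp_c": 22.1},
]

to_upload = filter_readings(raw)
# Only s2's reading leaves the device; the rest stay local.
```

In practice the filtering rule would reflect the application (anomaly detection, change detection, and so on), but the principle is the same: decide on the device, transmit selectively.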
Techopedia Explains Edge Computing
Edge computing is an increasingly common way to make networks more efficient.
In the early days of big data, best practices generally involved routing data to a central data warehouse, on site or in the cloud, where it would be stored, retrieved, analyzed and transformed. This remained the dominant model until recently, when “edge” data collection started to arise as a practical alternative.
To collect data near the edge of a network, businesses look far afield from the data warehouse and consider how to gather and analyze data near its source. An excellent example is in internet of things (IoT) systems, where it may not be practical to funnel a lot of device or sensor data into the data warehouse.
The pursuit of edge analytics is gaining ground in IoT architectures and other types of enterprise systems. Because companies can “thin” data or otherwise cull data results, edge data collection and analytics can help with issues such as network congestion and latency.
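One common way to “thin” data at the edge is windowed aggregation: instead of streaming every raw sample, the edge node sends one summary record per time window. The sketch below assumes hypothetical sensor samples and window size.

```python
# Hypothetical "thinning" at the edge: collapse a window of raw samples
# into a single summary record before transmitting upstream.
from statistics import mean

def summarize_window(samples):
    """Reduce a window of raw numeric samples to one summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": round(mean(samples), 2),
    }

window = [21.0, 21.5, 22.0, 80.0, 21.2]
summary = summarize_window(window)
# One small record replaces five raw samples on the network.
```

Sending one summary per window instead of every sample directly addresses the congestion and latency issues mentioned above, at the cost of losing per-sample detail.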
An intelligent device has its own computing capability so it can process data as close to its source as possible. While this is useful when an instantaneous transfer of information is essential, it also increases the risk that intelligent devices at the edge can become attack surfaces for cybersecurity threats.
To protect this new type of network node, many organizations are turning to Secure Access Service Edge (SASE) which combines software-defined wide area network (SD-WAN) capabilities with network security services. The SASE framework includes capabilities such as cloud access security brokers (CASBs), Zero Trust and next-gen firewalls as a service (FWaaS) in a single cloud service model.
Edge Computing vs. Fog Computing vs. MEC Computing
A lack of agreed-upon standards has complicated the way edge computing services are being marketed.
Although “edge” seems to be the most popular way of describing the concept of extending the cloud to the point where data originates, the competing labels Fog Computing and MEC Computing are also being used by vendors — sometimes as synonyms.
To avoid confusion, network architects recommend using the label Edge Computing when discussing the general concept of reducing latency between the data’s source and supporting compute/storage resources.
The label Fog Computing should be used when data is sent to a nearby gateway server for processing. Fog Computing is often associated with Cisco, and gateways may also be referred to as Fog servers or Fog nodes.
The label Multi-Access Edge Computing should be used when discussing the open standards framework for edge computing that is being developed by the nonprofit group ETSI. The framework is designed to ensure developers have access to a consistent set of APIs.