Definition - What does Edge Computing mean?
Edge computing in IT is defined as the deployment of data-handling activities or other network operations away from centralized, always-connected network segments and toward the individual sources of data capture: endpoints such as laptops, tablets and smartphones. Through this type of network engineering, IT professionals hope to improve network security and other network outcomes.
Techopedia explains Edge Computing
Generally, the term "edge computing" is used as a catch-all for various networking technologies, including peer-to-peer networking and ad hoc networking, as well as various types of cloud setups and other distributed systems. Another prominent type is mobile edge computing, an architecture that uses the edge of the cellular network for processing.
One of the major uses of edge computing is to improve network security. There is considerable concern about security architecture in the internet of things age, where more and more diverse devices are getting different kinds of access to a network. One strategy is to use edge computing to aggregate data further out, near its source, and encrypt it as it passes inward through firewalls and other perimeter defenses.
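The aggregate-then-forward idea can be sketched in a few lines. This is a minimal illustration, assuming a batch of temperature samples collected on an edge device; the function names and summary format here are hypothetical, not a real edge framework's API, and a real deployment would encrypt the payload (for example, over TLS) before it crosses the network perimeter.

```python
# Minimal sketch: an edge node condenses a batch of raw sensor readings
# into a compact summary before forwarding it upstream. Names and the
# summary schema are illustrative assumptions, not a standard API.
import json
import statistics


def aggregate_readings(readings):
    """Reduce a batch of raw readings to a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(statistics.mean(readings), 2),
    }


def prepare_for_upstream(readings):
    """Serialize the summary; in practice this payload would be
    encrypted in transit before passing the perimeter."""
    return json.dumps(aggregate_readings(readings)).encode("utf-8")


# 100 simulated temperature samples captured at the edge
raw = [20.0 + i * 0.1 for i in range(100)]
payload = prepare_for_upstream(raw)
print(len(payload), "bytes sent upstream instead of",
      len(json.dumps(raw).encode("utf-8")), "bytes of raw data")
```

The design point is that only the small summary crosses the network, which both shrinks traffic and narrows what an eavesdropper inside the perimeter could see.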
Edge computing can also decrease the distance that data must travel in a network, or help with a detailed network virtualization model.
Edge computing works in various ways, and contributes to IT architectures in different capacities. It is a frequent and popular means of enhancing networks to promote efficiency and more capable security for business systems.