The emergence of data-intensive technologies such as virtual and augmented reality, autonomous vehicles, and generative AI has created enormous innovation and opportunity, but it has also put increased strain on existing data center capacity.
As a result, IT infrastructure has shifted to a hybrid model – one that requires sophisticated management.
However, with the rise of artificial intelligence in an edge computing environment, data processing is no longer confined to core data centers and centralized clouds, says Pierluca Chiodelli, vice president of engineering technology for edge computing offers, strategy, and execution at Dell Technologies.
Instead, it occurs closer to the data source, at the network’s edge, allowing for real-time decision-making and reducing the need to transmit massive amounts of data back to centralized locations.
“As a result, organizations must adopt a highly refined and advanced approach to managing workloads and data efficiently, securely, and intelligently across their entire IT estate,” Chiodelli explains.
“It’s essential for harnessing the full potential of data-intensive technologies while addressing the unique challenges posed by edge AI integration.”
In its new study, “How Edge Computing Is Enabling the Future,” Schneider Electric surveyed more than 1,000 IT decision-makers and found that 49% named managing hybrid IT infrastructure as their top IT challenge. The respondents expect edge computing to improve several key factors, such as speed, data security, and resiliency.
“Increasing data volume has also driven more data processing, putting a greater strain on carbon emissions and organizational sustainability,” according to the survey.
The decision-makers believe edge computing can help drive sustainability and achieve their companies’ environmental, social, and corporate governance goals.
Consequently, as organizational data increases and IT infrastructure expands in complexity, it will be critical for organizations to identify how they can track and measure energy at the edge, says Carsten Baumann, director of strategic initiatives and solution architect at Schneider Electric.
Low Latency + More Reliability = Faster Response Times
Edge computing allows data to be processed close to its source, which means faster service and greater reliability – and, in turn, better response times when companies use applications or programs, says Adonay Cervantes, global field CTO of CloudBlue, a multi-tier commerce platform.
“And because these applications operate on the network’s edge, they perform better with low latency,” he says.
Lee Ziliak, field chief technology officer and managing director of architecture at IT solutions provider SHI International, agrees with this assessment.
“Using data on the edge also allows an organization to analyze and predict from time-series data, increase monitoring capabilities, enhance performance, and drive greater value by mining fresh data points,” he explains. “This saves time and money by aggregating and keeping only the important data.”
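Ziliak’s point about aggregating and keeping only the important data can be sketched in a few lines. The example below is illustrative only – the window size, data layout, and function names are assumptions, not anything SHI has described – but it shows how an edge node might roll raw time-series readings into compact per-window summaries before anything is stored or sent upstream:

```python
# Illustrative sketch: aggregate raw time-series readings at the edge into
# per-window (min, max, avg) summaries, so only compact aggregates need to
# be kept or transmitted. Window size and field names are assumptions.
from collections import defaultdict

def aggregate(points, window=60):
    """points: (timestamp_seconds, value) pairs -> {window_start: (min, max, avg)}."""
    buckets = defaultdict(list)
    for ts, value in points:
        # Bucket each reading by the start of its time window.
        buckets[ts - ts % window].append(value)
    return {start: (min(vs), max(vs), sum(vs) / len(vs))
            for start, vs in buckets.items()}

raw = [(0, 1.0), (30, 3.0), (65, 10.0)]
print(aggregate(raw))  # two 60-second windows instead of three raw points
```

In a real deployment the aggregation period and the summary statistics would be tuned to the monitoring use case; the point is that the raw points never need to leave the edge.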
Regardless of workload, companies adopt edge computing because some product features cannot use the cloud due to practical or regulatory constraints, says David Kinney, senior principal architect at IT services company SPR.
He adds that the most common practical constraints that motivate the adoption of edge computing are when communication between the edge and the cloud introduces too much latency or when the communication medium is slow or unreliable.
“Latency is a core consideration for many systems that control machinery, such as the collision-avoidance systems in new cars,” Kinney says. “For many of these systems, delaying action by even a fraction of a second can have catastrophic consequences, so critical computations must be done at the edge.”
In terms of regulatory constraints, he says this often comes up for medical devices. Medical equipment that a patient relies upon for their life or health, such as an insulin pump, must continue working even when it cannot communicate with the cloud.
Tackling the Challenges of Data-Intensive Tech
Edge computing also helps reduce the costs associated with the transfer and storage of data, according to Saurabh Mishra, global director of IoT product management at SAS, a provider of analytics software.
“A massive amount of data is being created at the edge, and a good chunk of it is sensor-based,” he says. “This data may be redundant and its value short-lived.
“Instead of transferring and storing this data in the cloud and incurring associated costs, organizations are better off using edge computing to process that data locally at the edge and only transmit key events back to the cloud.”
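The pattern Mishra describes – process sensor data locally and forward only key events to the cloud – can be illustrated with a minimal sketch. The threshold, data shape, and names below are hypothetical:

```python
# Minimal sketch of edge-side filtering: most sensor readings are redundant
# and short-lived, so only readings that qualify as "key events" (here, a
# hypothetical threshold breach) would be transmitted to the cloud.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

THRESHOLD = 90.0  # hypothetical alert threshold

def filter_key_events(readings):
    """Keep only the readings worth transmitting; drop the rest locally."""
    return [r for r in readings if r.value >= THRESHOLD]

readings = [Reading("temp-1", 72.4), Reading("temp-1", 95.1), Reading("temp-2", 88.9)]
events = filter_key_events(readings)
print(len(events))  # only one reading crosses the threshold
```

Everything below the threshold is discarded at the edge, so neither transfer nor storage costs are incurred for it.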
More companies are combining edge computing and centralized data center processing in a hybrid model to tackle the challenges of data-intensive technologies such as augmented reality, virtual reality, autonomous vehicles, and advanced AI applications – data-hungry workloads that require complex real-time analysis to operate successfully, says Bob Brauer, founder and CEO of Interzoid, a data usability consultancy.
He adds that a cloud-only approach or a completely centralized approach would introduce a significant amount of latency into the use of these data-intensive technologies, making them less effective, less reliable, and possibly even unsafe, especially in the cases of self-driving vehicles or healthcare applications.
However, the hybrid solution enables heavy-duty data crunching, such as building AI models, to occur on powerful in-house systems, where infrastructure is generally cheaper and more scalable than shared cloud environments, Brauer says.
“Then, once AI models are complete, exhaustive, and well-tested, they can be rolled out to lighter-weight data nodes on the edge to be applied and made available much closer geographically to the systems, devices, and vehicles that are using these models,” he says.
As such, organizations can make instantaneous decisions without having to rely on communicating with centralized servers physically located somewhere else in the world. According to Brauer, this approach drastically reduces the latency risk without sacrificing the quality of the core AI models.
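A rough sketch of the workflow Brauer outlines: heavy training happens on central infrastructure, and only the finished artifact is shipped to edge nodes for local, low-latency inference. The “model” here is a deliberately trivial stand-in (a mean/standard-deviation anomaly check), and all names are illustrative:

```python
# Illustrative train-centrally / infer-at-the-edge split. The heavy step runs
# in the data center; edge nodes receive only the serialized artifact and
# score new readings locally, with no cloud round trip.
import json
import statistics

def train_centrally(samples):
    """'Heavy' step: fit a trivial anomaly model (mean/stdev) in the data center."""
    values = [v for _, v in samples]
    return {"mean": statistics.fmean(values), "stdev": statistics.pstdev(values)}

def export_model(model):
    """Serialize the finished model for distribution to edge nodes."""
    return json.dumps(model)

def edge_inference(model_json, value, k=3.0):
    """'Light' step: flag a reading as anomalous at the edge."""
    m = json.loads(model_json)
    return abs(value - m["mean"]) > k * m["stdev"]

artifact = export_model(train_centrally([("s1", 10.0), ("s1", 11.0), ("s1", 9.0)]))
print(edge_inference(artifact, 30.0))  # far outside the learned range -> True
```

In practice the exported artifact would be a real trained model rather than two statistics, but the division of labor – expensive fitting done once centrally, cheap scoring repeated at the edge – is the same.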
Damien Boudaliez, senior vice president and global head of data solutions engineering at FactSet, a financial data and software company, describes how edge computing helps his company operate more efficiently.
“FactSet’s ticker plant cloud journey aimed to minimize latency in the real-time distribution of financial data,” he says. “Utilizing edge computing allows us to place data closer to global clients, thus optimizing performance, especially in regions like Asia where market distances present challenges.”
In addition, edge computing complements FactSet’s hybrid cloud model by enabling choice.
“We can use on-premises resources for heavy, predictable computing tasks and the cloud for more dynamic, location-sensitive needs,” Boudaliez says. “The strategy enhances the performance of both our external clients and internal teams. By situating computational resources closer to the clients and our global offices, we minimize latency and maximize efficiency.”
As the adoption of edge computing continues to expand across industries, so do the intricacies and demands of managing edge operations, says Dell’s Chiodelli.
“The edge environment is inherently distributed, presenting organizations with the dual challenge of wanting to collect and secure data at its source while grappling with limited IT expertise,” he says.
This complexity extends to the management and security of diverse edge deployments across many devices and locations, according to Chiodelli. Organizations need a streamlined approach to overseeing and securing their sprawling ecosystems of edge devices and applications.
While models that employ edge servers provide flexibility and control, this approach is not without essential considerations, specifically the management of technology at the edge, says Kelly Malone, chief business officer at Taqtile, an augmented reality software company.
“Devices and servers at the edge must be updated, synced, and managed, which can be complicated as this equipment is not, by definition of the edge approach, centrally located,” Malone says.
And as companies continue to dive into metaverse technologies, allowing them to collaborate on new levels and bring more efficiency to workers than ever before, they will need to adopt more edge-like technology to handle the amount of computing needed to achieve low latency and improve performance, says Michael McNerney, vice president of network security at technology company Supermicro.
“Not only is lower latency required to make decisions at the edge, but there is less bandwidth needed, so companies can handle more devices at the same bandwidth,” he says.
Without edge technology, devices operating at the edge would suffer from latency issues, cause bottlenecks in company networks, and face other processing-related challenges, says Sharad Varshney, CEO of OvalEdge, a data governance consultancy.
“However, it’s important to remember that edge computing is a framework that requires internal cultural changes if you want it to work in your organization,” he adds.
“Beyond this, edge computing is one of many solutions you should look into when streamlining data use in your organization.”