Only about 5% of artificial intelligence (AI) workloads currently run on edge computing infrastructure, but some estimates predict this share will rise to 50% over the coming years.
Edge infrastructure is scaling globally, driven by several factors, a key one being data center utilization rates at all-time highs across some of the most AI-intensive regions in the world, including Silicon Valley.
Additionally, AI and the Internet of Things (IoT) now process large volumes of data and demand low latency and real-time operation, needs that smaller regional edge data centers are well placed to meet.
The lack of available space and the high cost of data center rentals, particularly in urban areas, are major reasons why the edge may need to up its game and respond to the market, alongside power consumption and data latency.
And in an era of big data transfers, AI smartphones, and smart devices, we may need an extra ‘decentralized’ network surrounding central data services to act as the ‘glue’.
Techopedia talked to Steve Carlini, VP of Innovation and Data Center at Schneider Electric; Mark Troester, VP of Strategy at Progress; Mark Noland at Kingston Technology; and John White, Chief Operating Officer of US Signal, to unravel what might be one of the largest transformations of global digital infrastructure.
Key Takeaways
- Currently, only 5% of AI workloads run at the edge, but estimates suggest this could rise to 50% in the coming years, driven by factors such as record data center utilization.
- Demand for low latency and real-time operations from AI and IoT is also driving the focus on the edge.
- The transformation of edge computing brings new challenges, such as managing data analysis, accommodating AI’s hardware requirements, and supporting the growing use of IoT in homes, manufacturing, and, eventually, smart cities.
Data Center and AI Power Consumption on a Massive Rise
Schneider Electric, a French multinational company specializing in digital automation and energy management, believes that the share of AI computing happening at the edge will increase from 5% to 50%.
The company shared data with Techopedia on data centers’ AI workloads that reveals massive spikes at key times. It also predicts that the total power consumption of data centers will climb from 57 GW in 2023 to 93 GW in 2028.
In particular, AI power consumption is expected to skyrocket. Schneider Electric put AI power consumption in 2023 at 4.5 GW and expects it to reach between 14.0 and 18.7 GW by 2028. That means AI accounted for roughly 8% of total data center power in 2023 and is projected to represent 15% to 20% of it by 2028.
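Assuming those percentages describe AI’s share of total data center power (the interpretation that matches the figures above), a quick back-of-envelope check reproduces them:

```python
# Back-of-envelope check of Schneider Electric's projections.
# The share interpretation (AI power / total data center power) is an assumption.
total_2023, total_2028 = 57.0, 93.0        # GW, total data center power
ai_2023 = 4.5                              # GW, AI power consumption in 2023
ai_2028_low, ai_2028_high = 14.0, 18.7     # GW, projected AI power in 2028

print(f"AI share of total, 2023: {ai_2023 / total_2023:.0%}")       # ~8%
print(f"AI share of total, 2028: {ai_2028_low / total_2028:.0%} "
      f"to {ai_2028_high / total_2028:.0%}")                        # ~15% to 20%
```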
A Shift from Data Centers to Edge Computing
Schneider Electric says that at the moment, 95% of all AI workloads run in centralized data centers and only 5% at the edge. However, the company believes that by 2028, the workload will be split evenly, 50/50.
Carlini explained that Schneider Electric initially projected that 85-90% of AI compute capacity would be at the edge by 2028 as the data gravity associated with multimodal AI (such as images and video) would necessitate placement closer to the user and the data.
“However, we have adjusted this projection to 50%-50% for 2028,” Carlini said. “These estimates are a combination of forecasts and contracts with large data center operators (where we have dedicated account coverage) correlated with IT equipment and semiconductor forecasts from various sources, including IDC, Gartner, IEA, and others.”
Cloud Versus Edge Operations
Noland from Kingston Technology spoke about the relationship between data centers and cloud computing and the upscaling of edge data centers.
“Data centers will continue to be the core of the cloud and house the repository or home base for the cloud, but the percentage of new growth for the cloud will have to happen on the edge.
“Edge servers are the local touch points for end users of video streaming services, gaming services, and AI-driven applications, as well as traditional web services,” Noland said.
“The number of edge servers required to provide low latency and high throughput to a huge number of end users will mean that there will have to be more storage on the edge overall than in the data center.
“Edge components might be different from data center components due to potentially different product durability required and different extreme environments (hot, cold) where edge computing is stationed.”
Does IoT or IIoT Put More Pressure on Global Infrastructure?
Techopedia asked the experts whether industrial sectors, with their IoT devices deployed at large scale, put more pressure on edge infrastructure than smart home IoT users do.
Troester from Progress said that the difference lies in the power of edge devices. The more powerful the device is, the more it will drive edge data center transformation.
Troester also highlighted that industrial IoT devices require massive scaling projects, as they are usually deployed in large numbers.
These devices also come with industry-specific requirements around security, privacy, data formats, bandwidth, precision, and latency, and these IIoT demands inevitably shape the engineering of edge data centers.
Noland said there is a long road ahead for IoT hardware and software in meeting the demands of AI and other new technologies.
“It seems that we are at the dawn of smarter IoT devices as very few of these devices are, as of yet, paired with AI tools and functionality. It would seem to me that IoT/AI-driven assistance and security tools are just now being rolled out.”
However, Noland recognized some progress and pointed to clear examples.
“Microsoft’s integration of Copilot, silicon designers like Intel, AMD, Samsung, and more are rolling NPUs (Neural Processing Units) into their latest chips.
“Everything from Wi-Fi routers to network switches will be able to take advantage of this technology to increase security and ease of use (who knows, perhaps Siri will become useful someday!)”
White from US Signal disagrees, arguing that smart home IoT devices, while deployed at a smaller scale per user, ultimately add up to a big number. He explains that these devices have higher adoption rates than those used in smart city projects or industrial businesses.
“For the consumer, the ROI of cameras, smart locks, security systems, and others, are here today. To the tech-savvy consumer, a DIY security system can be installed in a day and is cheaper than a traditional security system — it’s a no-brainer with significant upside to the end consumer and their family.”
White also explained why industrial IoT is lagging in deployment.
“On the industrial or smart cities side, I’m finding that most are struggling to unhitch themselves from the technical debt of aging systems, budgets, and legacy applications to move towards utilization of IoT devices to enrich their customers’ experience, gain greater efficiency, or simply provide insights on their businesses.”
How the AI-Smartphone Revolution Will Affect the Edge
As the smartphone industry enters a period of intense competition fueled by AI smartphones, data center operators, edge providers, and other infrastructure leaders are examining how this will affect their architectures.
Carlini explained that Schneider Electric already works directly with manufacturers of IT equipment such as servers, processors, and network switches. To meet the demands of the mobile industry’s future, they are collaborating on roadmaps and adjusting their portfolio for new products and technologies.
It’s All About the Data, Not the Hype
Troester said that the transformation of edge computing should be driven by data analysis, not by deploying powerful new technology for its own sake.
“While AI is hyped, organizations really need to first think about how to manage and extract value out of the data. Data is what fuels AI and while it is easy to get caught up in the latest AI trends, for most organizations, thinking practically about managing and analyzing the data is the first step.”
Troester advised leaders to consider the value AI can drive for their business and build up from there. “Then you can make informed decisions about where the data or analytics processing should occur — on the edge or in the cloud,” Troester said.
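As a rough illustration of the kind of placement decision Troester describes, the sketch below applies a simple heuristic to hypothetical workloads. The thresholds, the `Workload` fields, and the `place` function are illustrative assumptions, not a recommendation from Progress or any vendor:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float      # latency the application can tolerate
    daily_data_gb: float       # raw data generated per day
    needs_realtime: bool       # must decisions be made on the spot?

def place(w: Workload) -> str:
    """Very rough heuristic: latency-critical or data-heavy work stays at
    the edge; everything else can run in a central cloud region."""
    if w.needs_realtime or w.max_latency_ms < 20:
        return "edge"
    if w.daily_data_gb > 500:  # assumed cutoff: too costly to backhaul it all
        return "edge (pre-aggregate, send summaries to cloud)"
    return "cloud"

print(place(Workload("video analytics", 10, 2_000, True)))    # -> edge
print(place(Workload("monthly reporting", 5_000, 5, False)))  # -> cloud
```

In practice, the inputs to such a decision would come from the business goals and data-value analysis Troester mentions, not from fixed numeric cutoffs.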
The Complex AI-IoT Relationship
Noland from Kingston explained that while the changes need to happen organically, he is certain of one thing: the AI revolution will require more CPUs, GPUs, NPUs, NICs, memory, and storage. But Noland warned companies not to invest in technology that is not fit for their business.
“Nobody wants to jump out on a limb to spend money on unproven technology before it’s ready. When a killer app (or tech) is discovered that takes advantage of the new technology then users will flock to this new technology. IoT-AI is on the edge of barreling down on us in the next several years.”
Turning to specific technologies, Noland pointed to advances in silicon lithography and its power efficiency.
“The edge will have to grow substantially to meet the new use cases, and newer technologies can be rolled into the edge as they are developed; existing edge servers won’t have to be retired earlier than scheduled as they can be augmented as new edge technologies are implemented.”
White agreed: “As we look to Edge AI, the existing infrastructure doesn’t need to be thrown away; it just needs to evolve.
“Any new technology needs to be embraced in a similar way: Assess existing infrastructure; reduce technical debt (things that will slow you down); identify a problem; initiate a proof of concept; and then execute.”
Troester also agreed that the transformation will come down to each use case, which is why, he explained, it is important for organizations to clearly identify the business’s goals and requirements.
“Once this is done, the architecture will fall into place — in terms of the best way to approach it, what the right mix of hardware, cloud, software, etc. There is not a one-size-fits-all approach that will work for everyone.”
How the Edge Will Grow
The hardware and software changes that global edge computing infrastructure must make to accommodate the near future and the rapid pace of innovation are undeniable and inevitable.
The big question is how the new edge will be built. Techopedia asked the panel whether they are looking to scale, turn to hardware virtualization, or invest in new equipment and software programming.
Carlini said that edge computing will get a boost from what is known as edge AI: the need to run AI models close to the user and the data.
“This will be a year of workflow automation where AI models at the edge will work to make existing systems and processes more efficient for many industries, such as transport, manufacturing, and medical.”
Carlini explained that different factors determine the generative AI model that needs to be deployed at the edge, including the scope, how fast a decision needs to be made, the accuracy of the output, and the output type.
“For example, if your AI model is operating an emergency vehicle routing system, it will need to analyze high-definition videos and traffic patterns to perform real-time operations to clear the traffic,” Carlini added.
“This would probably require an IT network of high speed, networked, edge AI data centers that could process a large amount of data and deliver real-time decisions and predictions.”
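Carlini’s routing example implies a tight sense-decide-act loop running on the edge node itself. The sketch below is purely illustrative (the function names, the 50 ms budget, and the stubbed detector are assumptions, not part of any Schneider Electric design); it only shows why the loop must stay local, since a round trip to a distant cloud region would consume the entire latency budget:

```python
import time

def read_frame(camera_id: str) -> dict:
    """Placeholder for grabbing the latest HD frame from a roadside camera."""
    return {"camera": camera_id, "captured_at": time.time()}

def detect_emergency_vehicle(frame: dict) -> bool:
    """Placeholder for an on-site vision model (e.g., an object detector
    running on a local GPU or NPU). Always returns False in this stub."""
    return False

def update_signal_plan(intersection: str, clear_route: bool) -> None:
    """Placeholder for pushing a new signal-timing plan to the intersection."""
    print(f"{intersection}: clear_route={clear_route}")

def control_loop(camera_id: str, intersection: str,
                 budget_ms: float = 50.0, iterations: int = 5) -> None:
    """Keep the camera-to-decision path entirely on the edge node so the
    latency budget is not spent on a trip to a distant cloud region."""
    for _ in range(iterations):
        start = time.perf_counter()
        frame = read_frame(camera_id)
        clear = detect_emergency_vehicle(frame)
        update_signal_plan(intersection, clear)
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > budget_ms:
            print(f"warning: missed the {budget_ms} ms budget ({elapsed_ms:.1f} ms)")
        time.sleep(max(0.0, (budget_ms - elapsed_ms) / 1000))

control_loop("cam-42", "5th-and-main")
```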
Decentralized Edge Computing Networks Are the Future
As Carlini explains, the challenges and pressures the edge faces are numerous and significant.
He said: “To support data-intensive and ultra-low latency applications such as high-definition streaming media, augmented reality (AR) / virtual reality (VR), autonomous vehicles, autonomous mining, and Industry 4.0, we must place compute and storage resources at the network edge.”
Carlini explained that edge computing alone is not enough, pointing to the need for faster network communications between edge data centers and edge devices.
A challenge today is our global networks, which must provide high bandwidth, security, privacy, and reliable services capable of scaling up and down.
Carlini spoke about distributed network edge data centers that will be deployed at scale to support new data-intensive and ultra-low latency applications. “They will come in different sizes and configurations and will operate in a wide range of environments,” Carlini explained.
These new edge centers must be highly available and properly maintained to avoid unexpected power outages and other forms of unscheduled downtime that would disrupt critical applications. They will also have to account for carbon emissions and energy efficiency.
Regarding specifics, Carlini recommends that companies use monitorable power gear with lithium-ion battery backups, air or liquid cooling according to density, environment, and space constraints, and robust IT enclosures designed for harsh environments.
As a resource, Schneider Electric offers a guide to deploying resilient data centers at the network edge, highlighting sustainable prefabricated modular solutions, cybersecurity practices, monitoring and management software, and the importance of leveraging an ecosystem of partners for a complete solution.
Noland added that decentralized edge computing, which removes a central bottleneck in our current infrastructure, will require modernized components.
“Older, slower components of the systems can cause bottlenecks, but faster system memory and flash storage can ensure that you are getting the most performance out of Edge Compute CPU & GPU to service the end users,” Noland said.
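A rough way to reason about the bottleneck Noland describes is to compare how fast the storage and memory tiers can feed data with the rate the accelerators need to stay busy. All figures below are illustrative assumptions, not measurements of any particular edge server:

```python
# Illustrative, assumed throughput figures for a single edge server (GB/s).
nvme_read = 7.0          # one PCIe 4.0 NVMe drive, sequential read (assumed)
ddr5_bandwidth = 60.0    # aggregate DDR5 memory bandwidth (assumed)
gpu_ingest_need = 25.0   # data rate the GPUs need to stay fully busy (assumed)

# For data-bound workloads, the slowest feeding tier sets the pace.
bottleneck = min(nvme_read, ddr5_bandwidth)
if bottleneck < gpu_ingest_need:
    idle_fraction = 1 - bottleneck / gpu_ingest_need
    print(f"Storage is the bottleneck: GPUs starve roughly {idle_fraction:.0%} of the time.")
else:
    print("The accelerators, not memory or storage, set the pace.")
```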
The Bottom Line
The edge is no stranger to transformation. In the 1990s, it helped the World Wide Web scale, reducing downtime and latency by bringing data such as images, scripts, HTML, and text closer to users around the globe. But fast-loading a 90s webpage is nothing like what the edge does today or what it will need to do in the future.
Today, the edge has to bring incredibly workload-heavy resources closer to the user, eliminating the latency between central cloud data centers and IoT devices. It is far from an easy feat.
By ensuring reliability and continuity, reducing latency, and avoiding disruptions and downtime, edge computing plays a role in our digital world that cannot be overstated or taken for granted.
Bringing data closer to the user may not be a new concept. What is new is the amount of data flowing through our world. Engineers and developers must make sure they build the edge infrastructure needed for today and the future as innovation races ahead.
References
- Steven Carlini’s LinkedIn profile (LinkedIn)
- Schneider Electric Official Website (Schneider Electric)
- Mark Troester’s LinkedIn profile (LinkedIn)
- Progress Official Website (Progress)
- Mark Noland’s LinkedIn profile (LinkedIn)
- Kingston Technology Official Website (Kingston Technology)
- John White’s LinkedIn profile (LinkedIn)
- US Signal Official Website (US Signal)
- Deploy resilient data centers at the network edge (Schneider Electric)