Last year’s Dell Technologies World event set a high bar for AI innovation. The introduction of Dell AI Factory and the early integration of AI into Dell PCs were seen as a bold move by the Texas-based company to make AI more accessible across enterprise environments.
But 2025 marks a more decisive turn. Dell is doubling down on its vision, calling 2025 a “breakthrough year” as long-running AI projects move from concept to execution.
At this year’s Dell Technologies World conference, held at the Venetian Convention & Expo Center in Las Vegas from May 19 to 22, the American PC maker showcased its latest push to operationalize that AI vision across infrastructure, devices, and services.
From new AI Factory partners to next-generation servers, high-performance AI PCs, and updated data platforms, Techopedia tracked the key reveals that hint at where Dell is placing its bets and what might come next.
Key Takeaways
- Dell Technologies World 2025 saw the company announce a slew of enterprise-level AI innovations across infrastructure, PCs, and data platforms.
- New PowerEdge servers using Nvidia’s HGX B300 and AMD’s MI350 GPUs were introduced to support high-performance AI workloads and LLM training.
- Dell introduced Project Lightning, a high-speed file system built for enterprise AI workloads.
- PowerCool, a new liquid cooling solution, promises up to 60% savings on data center cooling costs.
- The Pro Max Plus AI PC pairs Qualcomm’s AI 100 discrete NPU with Intel’s Arrow Lake HX processors, built for AI professionals who need to run models locally.
The Biggest Reveals at Dell Technologies World 2025
This year’s event kicked off with a vision for the AI-powered enterprise. The company founder and CEO, Michael Dell, described Dell’s latest AI strides as the “infrastructure backbone” for enterprises leveraging data to drive intelligent solutions.
Here are the key products from the event:
1. PowerEdge Servers with Nvidia HGX B300
Dell used the opening day of the event to introduce the newest versions of its PowerEdge servers, the XE9780 and XE9785. According to the PC maker, the servers feature Nvidia’s HGX B300 platform and will support up to 256 Blackwell Ultra GPUs.
The two servers are engineered to deliver maximum performance for AI and high-performance computing (HPC) workloads. They come in both air-cooled and liquid-cooled versions, which offer flexible deployment based on thermal and spatial constraints.
The air-cooled variant is a 10U system that fits standard 19-inch racks and supports both Intel Xeon 6 and 5th Gen AMD EPYC processors. This configuration supports 1.8 TB/s of GPU-to-GPU throughput. Each server also includes eight ConnectX-8 network interfaces running at 800 Gbps, allowing multiple GPU servers to be clustered into a massive, unified AI training system.
Meanwhile, the liquid-cooled version is a compact 3U design capable of housing up to 256 Nvidia Blackwell Ultra GPUs per rack, Dell’s highest GPU density to date.
The servers are purpose-built to meet the demands of generative AI, LLMs, and real-time inferencing, forming the backbone of Dell’s next-generation AI infrastructure.
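Dell did not detail the software stack at this level, but as a rough, vendor-neutral sketch of what clustering multiple GPU servers into a unified training system looks like in practice, here is how a multi-node data-parallel job is typically wired up with open-source PyTorch over NCCL, with the high-speed NICs carrying gradient traffic between servers (none of this is Dell-specific tooling):

```python
# Minimal multi-node data-parallel training sketch (vendor-neutral, not Dell tooling).
# Launched on each node via torchrun, e.g.:
#   torchrun --nnodes=4 --nproc_per_node=8 \
#            --rdzv_backend=c10d --rdzv_endpoint=<head-node>:29500 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # NCCL rides on the RDMA/Ethernet fabric (e.g., 800 Gbps NICs) to
    # all-reduce gradients between GPUs sitting in different servers.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(4096, 4096).cuda(local_rank)  # stand-in model
    model = DDP(model, device_ids=[local_rank])
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(32, 4096, device=f"cuda:{local_rank}")
        loss = model(x).pow(2).mean()
        opt.zero_grad()
        loss.backward()   # gradients are synchronized across all ranks here
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```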
2. PowerEdge Servers with AMD GPUs
Dell also refreshed its PowerEdge XE9785 and XE9785L servers at the event. The two servers are now designed to house AMD’s Instinct MI350 Series GPUs, each carrying 288GB of ultra-fast HBM3E memory.
Dell states these servers offer up to 35 times greater inferencing performance than the previous iterations, and are suitable for large language models and real-time data analysis.
The XE9785 and XE9785L also come in air-cooled and liquid-cooled variants. The two servers will eventually power Dell’s AI Platform with AMD, which currently runs on XE9680 servers and was upgraded ahead of this year’s Dell Tech World.
According to Varun Chhabra, senior vice president of infrastructure and telecom marketing at Dell, the update will help improve customer time to value by up to 86% by offering turnkey AI infrastructure that is more scalable, efficient, and optimized for modern enterprise AI workloads.
3. Dell AI Data Platform – Project Lightning
The Dell AI Data Platform is a unified system designed to streamline enterprise-scale AI workflows. A new enhancement announced at this year’s conference is Project Lightning, a parallel file system due for release later this year that Dell claims will be the world’s fastest.
The platform also integrates vector search capabilities directly within the Dell Data Lakehouse, allowing for immediate data set creation and fast querying, both crucial for AI model training and retrieval-augmented generation (RAG) applications.
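Dell has not published the query interface for this capability, but conceptually the vector-search step behind RAG works like the vendor-neutral sketch below, which uses the open-source sentence-transformers library as a stand-in embedding model and plain cosine similarity in place of the lakehouse’s own index:

```python
# Vendor-neutral sketch of vector search for RAG: embed documents, embed the
# query, rank by cosine similarity. Not Dell's implementation.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "PowerEdge XE9780 servers support the Nvidia HGX B300 platform.",
    "PowerCool eRDHx cuts data center cooling costs by up to 60%.",
    "The Dell Pro Max Plus pairs a discrete NPU with Intel processors.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")        # stand-in embedding model
doc_vecs = model.encode(docs, normalize_embeddings=True)

query = "Which server works with Blackwell Ultra GPUs?"
q_vec = model.encode([query], normalize_embeddings=True)[0]

scores = doc_vecs @ q_vec          # cosine similarity (vectors are normalized)
best = int(np.argmax(scores))
print(f"Top match ({scores[best]:.2f}): {docs[best]}")
```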
Dell has also added SQL-accessible LLM functions, such as sentiment analysis and text summarization, all baked directly into a customer’s instance of the AI Data Platform.
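Dell has not shared the exact syntax, so purely as an illustration of what SQL-accessible LLM functions could look like, the sketch below queries a Trino-compatible endpoint (the Dell Data Lakehouse is built on Starburst’s Trino-based engine) with the open-source trino Python client; the ai_sentiment and ai_summarize function names, the hostname, and the table are hypothetical placeholders, not confirmed Dell syntax:

```python
# Illustrative only: ai_sentiment / ai_summarize are hypothetical placeholders,
# not confirmed Dell Data Lakehouse functions.
import trino

conn = trino.dbapi.connect(
    host="lakehouse.example.com",   # assumed hostname
    port=443,
    user="analyst",
    http_scheme="https",
    catalog="lakehouse",            # assumed catalog/schema names
    schema="support",
)

cur = conn.cursor()
cur.execute("""
    SELECT ticket_id,
           ai_sentiment(body)     AS sentiment,   -- hypothetical LLM function
           ai_summarize(body, 50) AS summary      -- hypothetical LLM function
    FROM tickets
    WHERE created_at > DATE '2025-05-01'
""")
for ticket_id, sentiment, summary in cur.fetchall():
    print(ticket_id, sentiment, summary)
```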
4. Dell PowerCool Enclosed Rear Door Heat Exchanger
The PowerCool Enclosed Rear Door Heat Exchanger (eRDHx) is a cooling solution designed by Dell for AI-intensive data centers. Dell calls it another industry first: the system uses advanced liquid-cooling technology to manage the high thermal output of AI servers while maintaining performance and energy efficiency.
Dell revealed that organizations can achieve up to 60% savings in cooling costs using eRDHx compared to conventional solutions.
While this could be read as another marketing claim, Dell also says the new solution enables a 16% increase in AI and HPC rack density without requiring additional power, allowing enterprises to scale performance within existing energy constraints.
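To put those two numbers in context, here is a quick back-of-envelope calculation; the rack count and annual cooling spend below are assumed figures for illustration, not Dell data:

```python
# Back-of-envelope illustration of Dell's PowerCool claims.
# Baseline figures are assumptions, not Dell data.
baseline_racks = 100
baseline_cooling_cost = 1_200_000   # assumed annual cooling spend (USD)

cooling_savings = 0.60              # Dell's "up to 60%" cooling cost claim
density_gain = 0.16                 # Dell's 16% rack-density claim

new_cooling_cost = baseline_cooling_cost * (1 - cooling_savings)
effective_racks = baseline_racks * (1 + density_gain)

print(f"Cooling cost: ${baseline_cooling_cost:,.0f} -> ${new_cooling_cost:,.0f}")
print(f"AI/HPC rack capacity in the same power envelope: "
      f"{baseline_racks} -> {effective_racks:.0f}")
```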
The system also supports racks up to 80 kilowatts and includes hot-swappable fans, making it both powerful and serviceable.
5. Dell Pro Max Plus AI PC with Qualcomm Chips
With nearly 1.5 billion aging PC devices in circulation and the end of Windows 10 support fast approaching, Dell is positioning itself at the forefront of the next PC refresh cycle.
The new Dell Pro Max Plus AI PC combines Qualcomm’s AI 100 discrete NPU, which features 32 dedicated AI cores, with Intel’s Arrow Lake HX processors.
Dell’s SVP of Product Marketing stated that this is the first time an enterprise-grade discrete NPU has been integrated into a mobile form factor. Professionals working in AI-intensive fields may find the new AI PC very useful.
Dell confirmed it swapped out Nvidia chips for Qualcomm to maximize inferencing power at the edge. The device is built for AI engineers and data scientists who need to train and deploy models locally, without the need to connect to the cloud.
With the AI-100 NPU, users can run models with up to 109 billion parameters directly on the laptop, enabling use cases like deploying Meta’s Llama 4 language model on-device.
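Dell has not detailed the developer toolchain in this announcement, and the Qualcomm AI 100 ships with its own runtime, but the fully local workflow such a machine targets looks roughly like the sketch below, which uses the open-source Hugging Face transformers library and a small stand-in model rather than a 109-billion-parameter one:

```python
# Generic on-device inference sketch using the open-source "transformers"
# library with a small stand-in model. A Dell Pro Max Plus would run larger
# models through the Qualcomm AI 100 runtime, which is not shown here; this
# only illustrates the fully local, no-cloud workflow.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # small stand-in, not Llama 4
)

prompt = "Summarize why on-device inference matters for data privacy:"
result = generator(prompt, max_new_tokens=80, do_sample=False)
print(result[0]["generated_text"])
```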
The Bottom Line
2024 laid the groundwork for Dell’s AI Factory vision, but at Dell Technologies World 2025, the company signaled it is now moving beyond vision and into execution.
Based on what Techopedia observed, the announcements reflected a push toward turning early AI concepts into deployable enterprise solutions. Dell’s collaborations with Nvidia, AMD, and Qualcomm appear to support this effort, though how effectively they translate into real-world impact remains to be seen.
As with any major product reveal, bold performance claims were made. It will take time, and perhaps independent validation, to determine how much of it holds up outside the showroom.