Be Strategic in your AI Spending — Don’t Waste Your Money, Experts Warn

The tech industry is still abuzz with the rapid advancements in artificial intelligence (AI), and it’s not just the technology itself that’s making waves.

According to the latest data from International Data Corporation (IDC), spending on cloud infrastructure for AI-centric workloads is skyrocketing, with an expected 20% growth across 2024.

This surge in spending is driven by the increasing demand for specialized hardware as enterprises and hyperscalers race to harness the power of AI.

While the influx of investment is undoubtedly fueling innovation, it also raises questions about the long-term sustainability and strategic alignment of these expenditures. Are companies simply chasing the latest trend, or are these investments truly future-proofing their organizations?

In this article, we discuss the implications of this spending spree and how businesses can approach AI investment strategically to cut costs.

Key Takeaways

  • While the huge investment in AI is driving innovation, experts warn against blindly following AI trends without a strategic, long-term focus on value creation.
  • Companies must understand their realistic, viable use cases for deploying AI applications and ensure access to clean, quality data to avoid exorbitant computing costs.
  • To cut wasteful AI spending, businesses should prioritize initiatives offering higher ROI, utilize cloud cost management tools, leverage open-source components, and explore pricing models beyond traditional pay-as-you-go.
  • For smaller firms, using available AI models via API on a pay-per-use basis may be more cost-effective than maintaining heavy models on their own cloud, easing cost prediction and update management.

AI Hardware is Shaping Cloud Infrastructure Expenditure

The cloud computing industry is experiencing explosive growth, and while this trend is not new, enterprises’ desire for AI fuels the fire.


And what is the kindling for the fire? Powerful AI hardware.

At the heart of this are advanced Graphics Processing Units (GPUs), which have become the go-to hardware for training and running AI models. The GPU market has witnessed remarkable growth, with key players like Nvidia, Intel, and AMD leading the charge. Nvidia has seen its stock price skyrocket in the past year, going from around $200 to $853 per share at the time of writing.

This rise is a testament to the insatiable demand for AI chips, which power AI chatbots like ChatGPT and the data centers of major tech firms such as Meta, Microsoft, and Amazon.

However, industry experts caution that the hype around AI hardware may be outpacing the reality. In a recent report by Marketplace, Matt Bryson, senior vice president of research at Wedbush Securities, suggests that the $400 billion valuation for the AI chip market by 2027 may be difficult to validate.

He likens the current AI boom to the 4G wireless revolution, where the true impact and applications were not fully realized until the launch of transformative apps like Uber and Instagram.

Bryson’s sentiments echo the need for companies to approach their AI infrastructure investments with a balanced and strategic mindset, focusing on long-term value creation rather than chasing the latest hype.

Consider Real AI Use Cases Before Spending

While businesses rush to take advantage of the AI boom, Colleen Tartow, Field CTO and Head of Strategy at VAST Data, warns that simply getting on GPU waitlists is not enough.

“Organizations must focus on the realistic, viable use cases behind their desire to ‘do AI’.”

Speaking to Techopedia, Tartow emphasized that businesses need to understand the full complexity involved in deploying production-quality AI applications.

She said:

“To innovate and extract value, AI needs access to vast amounts of data, coming from all different sources, across different geographies and hybrid environments. Limiting data sprawl and ensuring that models are being trained on clean data is paramount to avoid astronomical computing costs.”

This sentiment is echoed by Arthur Delerue, a founder and CEO who discussed cloud and AI investment challenges with Techopedia.

He said:

“We use generative AI in our product, and we are extremely careful about the infrastructure costs.

“It is tempting to go ‘all-in’ in generative AI as it is easy to come up with impressive proofs of concepts.

“But then scaling up an AI-based application from a couple of requests to millions of requests can be extremely challenging from a cost standpoint.

“So our advice is this: do the math at the very beginning of your project in order to understand if your AI-based product can be profitable.”
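Delerue's "do the math at the very beginning" advice amounts to a simple break-even calculation. A minimal sketch, using made-up placeholder prices (none of these figures come from the article):

```python
# Hypothetical back-of-the-envelope profitability check for an AI feature.
# All dollar figures are illustrative assumptions, not real cloud prices.

def breakeven_requests(fixed_monthly_cost, revenue_per_request, cost_per_request):
    """Requests per month needed before the feature pays for itself.

    Returns None if each request costs more than it earns, i.e. the
    product can never be profitable at any scale under these numbers.
    """
    margin = revenue_per_request - cost_per_request
    if margin <= 0:
        return None  # loses money on every single request
    return fixed_monthly_cost / margin

# Example: $2,000/month of fixed infrastructure, $0.05 earned and
# $0.02 of inference cost per request.
n = breakeven_requests(2000.0, 0.05, 0.02)
print(f"Break-even at roughly {n:,.0f} requests/month")
```

Running the numbers before building, as Delerue suggests, immediately flags the failure mode he describes: a proof of concept that looks impressive at a handful of requests but whose per-request margin turns negative at millions of them.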

How Can Businesses Strategize Their AI Spending to Stop Waste?

To cut waste on AI spending, Alok Shankar, Software Development Manager at Oracle, calls on businesses to place a higher premium on AI initiatives that can yield higher returns on investment (ROI).

He told Techopedia:

“Businesses can prioritize initiatives that offer higher ROI. There should be clear focus on business needs and metrics definitions to measure success. Cloud cost management tools and strategies that use cheaper spot instances for less power-hungry applications can be helpful. Using open-source tools and components can also be useful for cost-saving measures.”

Shankar also noted the need for businesses to explore pricing methods beyond the traditional pricing models.

“Cloud vendors are usually open to negotiating prices for larger customers with compute-intensive workloads. However, newer cloud pricing models beyond traditional pay-as-you-go structures could be beneficial.

“Typically reserved instances work well for such needs. Edge AI cloud infrastructures that do not move a lot of data can be priced differently and cheaper,” he said.

Chief AI Officer at Comply Control, Mikhail Dunaev, cautions businesses against skimping on servers for AI.

He told Techopedia:

“In today’s technological race, being current is critical, and falling behind is all too easy. You should focus all your efforts on development, anticipating that cloud prices for generative AI will gradually decrease.

“For smaller companies, it’s often more practical to use available models via API rather than maintaining heavy models on their own clouds. Here, you pay for the use of AI resources (pay-per-resource) rather than for powerful servers. This approach makes cost prediction easier and eliminates concerns about updates.”
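Dunaev's trade-off for smaller companies can be framed as a crossover calculation between per-call API pricing and a fixed-cost dedicated server. A minimal sketch with invented placeholder prices (the per-request and server rates below are assumptions for illustration only):

```python
# Hypothetical monthly-cost comparison: pay-per-use API vs. self-hosted GPU.
# Prices are made-up placeholders, not quotes from any vendor.

def monthly_cost_api(requests, price_per_request):
    """API route: cost scales linearly with usage, no fixed commitment."""
    return requests * price_per_request

def monthly_cost_self_hosted(server_cost_per_month):
    """Self-hosted route: flat cost regardless of traffic."""
    return server_cost_per_month

def cheaper_option(requests, price_per_request, server_cost_per_month):
    api = monthly_cost_api(requests, price_per_request)
    hosted = monthly_cost_self_hosted(server_cost_per_month)
    return "api" if api <= hosted else "self-hosted"

# A small shop doing 50,000 requests/month at a notional $0.002 each
# pays $100, far below a notional $1,500/month GPU server.
print(cheaper_option(50_000, 0.002, 1500.0))     # api
print(cheaper_option(2_000_000, 0.002, 1500.0))  # self-hosted
```

This is the arithmetic behind Dunaev's point: at low volumes the API route is cheaper and, as he notes, easier to forecast, while the crossover to self-hosting only arrives at sustained high traffic.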

The Bottom Line

While heavy investment in AI-driven cloud infrastructure has fueled innovation and propelled AI capabilities, it has also highlighted the need for more strategic and sustainable approaches to AI spending.

It’s important to avoid blindly following trends or making hasty decisions driven by a fear of missing out. Instead, businesses should align their investments with long-term objectives and carefully evaluate the true return on investment.

This will require a critical assessment of AI needs, identifying areas of overspending and potential cost savings.


Franklin Okeke
Technology Journalist

Franklin Okeke is an author and tech journalist with over seven years of IT experience. Coming from a software development background, his writing spans cybersecurity, AI, cloud computing, IoT, and software development. In addition to pursuing a Master's degree in Cybersecurity & Human Factors from Bournemouth University, Franklin has two published books and four academic papers to his name. His writing has been featured in tech publications such as TechRepublic, The Register, Computing, TechInformed, Moonlock and other top technology publications. When he is not reading or writing, Franklin trains at a boxing gym and plays the piano.