Green AI: How to Make AI Sustainable & Eco-Friendly?

Digital transformation was supposed to herald a cleaner, greener world. Then, generative AI (GenAI) arrived, as greedy for power as an entire country – raising urgent concerns about AI's environmental impact. Today, the data economy's apex app is driving the growth of giant-sized data centers and stressing local electricity grids.

Can natural power sources like wind and solar meet AI demand? Probably not. Experts say sustainable AI is possible, but we'll need to change how large language models (LLMs) are trained and the way applications are coded.

Without these interventions, artificial intelligence (AI) and the renewables transition are on a collision course. We consider how to avoid a crash.

Key Takeaways

  • Concerns about GenAI’s massive power consumption aren’t new, but the standard answers – more renewable power plants and upgraded electricity distribution networks – will take years to happen.
  • Data centers are already rationing computing capacity, and grids are experiencing AI-induced blackouts.
  • In the short term, fossil fuel generation is easy to switch on. But that puts AI on a collision course with the push for net zero.
  • New developments in sustainable coding and AI model development could provide a short-term fix.

AI & Renewables Are Headed for a Clash

ChatGPT and the GenAI models that followed it shocked the world with their computational power. Their power consumption is just as hair-raising.

A recent report by the US Department of Energy estimates that by 2028, AI's annual electricity usage will grow to between 160 and 320 terawatt-hours (TWh). At the top of that range, AI alone would consume more electricity than all US data centers combined do today.

For context, 320 TWh could power 20% of all American households for a year, while generating it could spew out as much carbon as 300 billion miles of driving – roughly equal to 1,600 round trips between the Earth and the Sun.
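Those comparisons hold up to a rough back-of-envelope check. The figures below – roughly 131 million US households averaging about 10,800 kWh per year, and an Earth–Sun distance of about 93 million miles – are illustrative assumptions, not numbers from the DOE report:

```python
# Back-of-envelope check on the 320 TWh comparisons.
# Assumptions (not from the DOE report): ~131M US households,
# ~10,800 kWh average annual use, Earth-Sun distance ~93M miles.
TWH = 320
kwh_total = TWH * 1e9            # 1 TWh = 1 billion kWh
households = 131e6
avg_kwh_per_home = 10_800

share_of_homes = kwh_total / (households * avg_kwh_per_home)
print(f"Share of US households powered for a year: {share_of_homes:.0%}")

miles = 300e9                    # 300 billion miles of driving
earth_sun_miles = 93e6
round_trips = miles / (2 * earth_sun_miles)
print(f"Round trips between Earth and Sun: {round_trips:,.0f}")
```

The arithmetic lands at roughly a fifth of US households and about 1,600 round trips, matching the article's framing.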

Can wind, solar, and hydro generate enough electricity to meet surging AI demand?

AI data centers operate around the clock, every day of the year, so they need constant power. Intermittent technologies like wind and solar power, which depend on the weather to run, can't meet all the demand, so data centers often default to carbon-intensive generation.

How Green Are Those Electrons?

A 2024 Harvard study found that data center dependence on fossil fuel generation is almost 50% higher than the US average. America’s data centers also tend to be located in places like West Virginia and Pennsylvania that have more carbon-intensive grids.

Keen to protect their sustainable credentials, Big Tech firms like Amazon, Google, and Meta have responded by investing in nuclear, even making a joint pledge to help triple the world’s atomic energy capacity by 2050.

But what about today? After years of being sidelined over safety concerns, nuclear energy now amounts to just 20% of US power generation and only a fraction of the electricity used by AI. In Virginia, where many of the largest hyperscale data centers are clustered, natural gas accounts for more than half of the electricity on local grids.

Replacing it with nuclear power will take decades. The arrival of small modular reactors and the Trump Administration’s recent moves to speed up nuclear plant construction could help, but the AI power crunch is happening now.

Why Does AI Need So Much Power?

Large language models are like supercharged engines constantly revving under GenAI's hood. In any given hour, they process tens of millions of inferences across different models, and model size directly drives their energy demand.

After a certain point, LLMs need more chips to run, and each new chip needs more energy. Smaller models like Mistral have around 24 billion parameters and LLaMA around 65 billion, while DeepSeek's R1 model has 671 billion parameters and GPT-4 is estimated to have over 1 trillion.
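Why does scale translate into more chips? A rough sketch: at 16-bit precision, each parameter occupies two bytes of memory, so the weights alone dictate a minimum accelerator count. The 80 GB figure below is an illustrative assumption, not a vendor spec, and real deployments need extra memory for activations and caches:

```python
# Rough sketch: minimum GPU count needed just to hold model weights.
# Assumes 2 bytes per parameter (fp16) and 80 GB of memory per GPU.
# Figures are illustrative; real serving needs additional headroom.
import math

BYTES_PER_PARAM = 2          # fp16/bf16 precision
GPU_MEMORY_GB = 80           # assumed high-end accelerator

def min_gpus(params_billions: float) -> int:
    weight_gb = params_billions * BYTES_PER_PARAM  # 1B params * 2 bytes = 2 GB
    return math.ceil(weight_gb / GPU_MEMORY_GB)

for name, size in [("Mistral", 24), ("LLaMA", 65),
                   ("DeepSeek R1", 671), ("GPT-4 (est.)", 1000)]:
    print(f"{name}: {size}B params -> at least {min_gpus(size)} GPUs")
```

Under these assumptions, a 24B model fits on a single accelerator, while a trillion-parameter model needs at least 25 – each drawing hundreds of watts around the clock.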

That's a massive number of parameters, and processing them at speed leaves GenAI with an unquenchable thirst for power. Today's grids are struggling to cope.

In March 2024, UK National Grid CEO John Pettigrew told a utilities industry forum that data center electricity demand is expected to surge by 500% over the next ten years.

Noting serious structural constraints, including an out-of-date distribution network, he warned that “future growth in artificial intelligence, quantum computing, and other groundbreaking technologies will require computing infrastructure that’s even more energy intensive than today’s.”

Hyperscale data centers are already feeling the pinch. With racks and racks of AI servers spread across millions of square feet, they're filling up fast with heavy-duty GPUs and ASICs that need more power to process more AI inferences – and additional power to keep everything cool and stable.

The world’s 11,000+ data centers will soon be joined by hundreds more. It’s an issue that won’t solve itself.

Structural Changes Will Take Time

The UK generated enough renewable energy last year to power every house in the country, so why can’t data centers simply request more green electricity from their energy suppliers?

Things get thorny when utilities try to link green generation to specific points of need. Most grids aren't optimized for that kind of granular demand response. Intermittency means that sometimes too little clean energy is being produced and sometimes too much, so storage infrastructure is also needed to bank surplus clean energy for peak periods.

Upgrading electricity distribution systems, adding utility-scale battery storage, and bringing more nuclear capacity into the generation mix could all help, but energy infrastructure buildouts take time. What happens when surging AI electricity demand creates a bottleneck today?

One option hyperscalers like AWS have resorted to is rationing computing capacity, which in practical terms means occasionally favoring one customer’s computing needs over another’s.

The easiest answer is to keep burning fossil fuels. For all their problems, coal, oil, and gas-burning plants are up and running, reliable, and located near the places where data centers tend to cluster.

Defaulting to dirty energy could drain interest from the long list of new renewable-energy and green tech projects waiting for regulatory approval. Meanwhile, AI is rapidly expanding from corporate pilots to full production and being embedded in every smartphone and cloud service.

A New Model

While utilities and clean energy projects work to improve electricity distribution, generative AI companies with Green AI initiatives are focusing on ways to reduce the carbon footprint of their LLMs.

One issue is misusing the scale of big, generalized data models when end users only need to perform specialized tasks.

Asking an LLM with 1 trillion parameters to tell you the best recipe for banana bread is a waste of resources. Why use a bulldozer when a garden shovel will do the trick?

An analysis by Microsoft researcher Reshmi Ghosh found that LLMs don't need to comb through every data point when processing a request, and could focus instead on the data needed to execute a given task.

That could give rise to smaller models designed to address specific use cases. In computing terms, they wouldn’t need to work so hard, could eliminate a lot of redundant effort, and would draw less power as a result.

Microsoft is already encouraging companies to experiment with small language models (SLMs) through its Phi-3 family of open models.

At a few billion parameters, SLMs can be applied to discrete tasks and are easier to train, needing less continuous looping of specific operations.
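In practice, one way teams capture these savings is with a simple router that sends easy prompts to a small model and reserves the large one for genuinely complex requests. Here is a minimal sketch of the idea – the model sizes, energy figures, and keyword heuristic are all illustrative assumptions, not any vendor's actual routing logic:

```python
# Minimal sketch of model routing: easy prompts go to a small model,
# complex ones to the large model. Sizes, per-query energy figures,
# and the keyword heuristic are illustrative assumptions.

MODELS = {
    "small": {"params_b": 4, "energy_wh_per_query": 0.3},
    "large": {"params_b": 1000, "energy_wh_per_query": 3.0},
}

COMPLEX_HINTS = ("prove", "analyze", "derive", "multi-step")

def route(prompt: str) -> str:
    """Crude heuristic: long or analytical prompts get the large model."""
    words = prompt.lower().split()
    if len(words) > 100 or any(hint in prompt.lower() for hint in COMPLEX_HINTS):
        return "large"
    return "small"

choice = route("What is the best recipe for banana bread?")
print(choice, "->", MODELS[choice]["energy_wh_per_query"], "Wh per query")
```

The banana bread question lands on the small model, at a tenth of the assumed per-query energy cost – the bulldozer stays parked.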

Trends in sustainable software development promise other ways to make AI models more energy-efficient.

Monitoring and optimizing AI models throughout the development lifecycle could reduce the carbon footprint of AI applications. Emerging practices like lean coding could trim away overly complex functions and flawed loop structures that make computing operations more energy-intensive.
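As a toy illustration of the kind of fix lean coding targets, the snippet below computes the same tally twice: once with a wasteful nested scan, once in a single pass. It is a generic example, not drawn from any cited codebase:

```python
# Toy example of a "flawed loop structure": counting occurrences with a
# nested O(n^2) re-scan versus a single O(n) pass. Fewer instructions
# executed means less CPU time and, at data center scale, less energy.
from collections import Counter

data = ["gpu", "cpu", "gpu", "asic", "gpu", "cpu"]

# Wasteful: re-scans the entire list once for every element.
counts_slow = {item: sum(1 for x in data if x == item) for item in data}

# Lean: a single pass over the data.
counts_fast = Counter(data)

print(counts_fast["gpu"])  # 3
assert counts_slow == dict(counts_fast)
```

On six items the difference is invisible; on billions of requests a day, quadratic scans like the first version are exactly the redundant effort that burns extra watts.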

MLOps teams could also adapt their model training schedules to run during off-peak hours, when energy prices tend to be lower, or opt for distributed training, spreading workloads across multiple machines to minimize bottlenecks and processor overheating.
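The off-peak scheduling idea can be sketched in a few lines: check the local hour before launching a heavy training run, and defer it if the grid is in a peak window. The 8:00–22:00 peak window below is an illustrative assumption; a real system would query a live grid carbon-intensity feed instead:

```python
# Sketch of carbon-aware scheduling: defer heavy training jobs to
# off-peak hours. The 8:00-22:00 "peak" window is an assumption;
# production systems would consult grid carbon-intensity data.
from datetime import datetime, timedelta

PEAK_START, PEAK_END = 8, 22   # local hours considered peak demand

def is_off_peak(dt: datetime) -> bool:
    return not (PEAK_START <= dt.hour < PEAK_END)

def next_off_peak(dt: datetime) -> datetime:
    """Return dt if already off-peak, else the next off-peak hour."""
    while not is_off_peak(dt):
        dt = dt.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
    return dt

start = next_off_peak(datetime(2025, 1, 15, 14, 30))
print(start)  # 2025-01-15 22:00:00
```

A job submitted mid-afternoon would wait until 10 p.m., when demand and prices are typically lower – the same logic generalizes to picking the greenest region for distributed training.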

And AI might be able to fix itself. Google started using DeepMind to analyze data center energy efficiency back in 2016, reportedly uncovering ways to reduce energy consumption by up to 40%, particularly for cooling systems.

Microsoft has been looking at other natural ways to keep data centers temperature-controlled. Its Project Natick submerged sealed server units in seawater to save power on cooling.

The Bottom Line

Is an eco-friendly AI possible? Experts think LLMs can be made more energy efficient by becoming more parameter-efficient. There are also sustainable coding techniques that make software run more efficiently and reduce the carbon footprint during model training.

AI is already forcing businesses and governments to grapple with painful copyright, legal, and economic questions. Experts believe it's now time to factor in the philosophical. The ongoing debate about GenAI's net impact on humanity has entered the environmental realm, and addressing it will take effort at every level, from grid operators to model developers.

FAQs

How much energy does generative AI consume?

Can renewable energy sources keep up with AI demand?

What are the environmental risks of large AI models?

How can AI be made more energy-efficient?

What is the “third way” between green energy and AI growth?

Is AI bad for the environment?

Mark de Wolf
Technology Journalist

Mark is a tech journalist specializing in AI, FinTech, CleanTech, and Cybersecurity. He graduated with honors from the Ryerson School of Journalism and studied under senior editors from The New York Times, BBC, and The Globe and Mail. His work has appeared in Esports Insider, Energy Central, Autodesk Redshift, and Benzinga.
