AI is a carbon-intensive technology. While many companies are committing to net-zero emissions, the growth in AI use threatens to throw a spanner in the works as investment in carbon-intensive models and infrastructure continues.
A new report [PDF] released by Google this week revealed that the tech giant’s greenhouse gas emissions increased by 48% over five years, driven largely by AI’s energy demands. The company noted that this was primarily due to increases in data center energy consumption and supply chain emissions.
The report also warned: “As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment.”
Google’s comments are an admission that as it works on generative AI-driven products like Gemini and Veo, its green initiatives are ultimately going to take a back seat to innovation.
Key Takeaways
- AI technology is highly energy-intensive, significantly increasing carbon emissions, especially as AI models grow more complex.
- Google’s greenhouse gas emissions rose by 48% over five years due to AI’s energy demands, primarily from data center operations and supply chain emissions.
- Data centers, accounting for 1-1.5% of global electricity use, contribute to AI’s carbon footprint through both operational energy and cooling systems.
- Reducing AI’s carbon impact involves adopting more efficient algorithms, decentralized data centers, and alternative cooling methods to decrease overall energy consumption.
Why Is AI So Carbon-Intensive?
At its heart, AI is an energy-intensive technology. The very process of training and developing machine learning algorithms and models leads to high energy consumption.
Even a simple task like generating an image with a model can use as much energy as it takes to charge your phone. To make matters worse, as models grow larger, with ever more parameters, overall energy consumption will keep rising.
David Craig, CEO of Iceotope, told Techopedia:
“AI has already had a substantial impact on carbon consumption. The growing demand for AI applications has led to an exponential increase in the need for computational power. This increase translates to higher energy consumption in data centers housing the servers running AI algorithms.”
Models must first be trained on large datasets containing millions of words (and, in some cases, images). They must then be deployed to servers equipped with multiple graphics processing units (GPUs) to perform computational inference.
These servers are often housed in large data centers, which collectively account for 1-1.5% of global electricity use. Servers in the data center consume power, and therefore produce carbon, not only while performing computational inference but also while being cooled.
“A significant part of this energy consumption comes from the cooling systems required to keep these high-performance servers operational. The carbon footprint of AI is a product of both the electricity required to power the servers and the energy needed to cool them,” Craig said.
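Craig’s point about cooling overhead is commonly quantified with Power Usage Effectiveness (PUE): the ratio of a facility’s total energy use to the energy consumed by its IT equipment alone. The sketch below illustrates the idea; the IT load and PUE figures are assumptions chosen for illustration, not measurements from any real data center.

```python
# Illustrative sketch of how cooling overhead inflates a data center's energy use.
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# All figures below are hypothetical assumptions, not measured values.

def total_facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total energy drawn by a facility, given its IT load and PUE."""
    return it_energy_kwh * pue

it_load = 1000.0          # kWh consumed by the servers themselves (assumed)
air_cooled_pue = 1.6      # assumed PUE for a conventional air-cooled facility
liquid_cooled_pue = 1.1   # assumed PUE for an efficient liquid-cooled facility

# Energy spent on cooling and other non-IT overhead in each scenario.
overhead_air = total_facility_energy_kwh(it_load, air_cooled_pue) - it_load
overhead_liquid = total_facility_energy_kwh(it_load, liquid_cooled_pue) - it_load

print(overhead_air)     # 600 kWh of overhead on top of the 1,000 kWh IT load
print(overhead_liquid)  # ~100 kWh of overhead
```

Under these assumed numbers, moving from air cooling to efficient liquid cooling would cut non-IT overhead by roughly five-sixths for the same computational work, which is why cooling draws so much attention in discussions of AI’s footprint.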
Not All Energy is Created Equally
It is important to note that the impact of AI and data centers also depends on the type of energy that a data center uses. A data center that uses nuclear power, hydro power, solar power, or wind power to power its infrastructure will generate less of an environmental impact than one that uses fossil fuels.
In any case, it’s worth highlighting that measuring the true carbon impact of AI is difficult because there is no universal standard for measuring carbon dioxide emissions across AI workloads.
This is amplified by the fact that big AI vendors like Google and OpenAI take black-box development approaches, offering limited transparency into how their models are trained and developed.
Together, these factors make it extremely difficult to measure AI’s overall impact on the environment. Google’s report, however, suggests that investment in AI, and in data center infrastructure specifically, is likely to drive up carbon emissions at other organizations pursuing AI development as well.
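Even without a universal standard, researchers often produce back-of-envelope estimates by combining hardware power draw, runtime, data center overhead, and the carbon intensity of the local grid. The minimal sketch below shows that arithmetic; every input value is an assumption for illustration, not data about any real training run.

```python
# Back-of-envelope estimate of CO2 emissions for an AI training run:
#   emissions = GPU-hours x average power draw x PUE x grid carbon intensity
# Every number below is an illustrative assumption, not a measurement.

def training_emissions_kg(gpu_hours: float,
                          avg_power_kw: float,
                          pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Estimated kilograms of CO2 emitted by a training run."""
    energy_kwh = gpu_hours * avg_power_kw * pue   # total facility energy
    return energy_kwh * grid_kg_co2_per_kwh

# Hypothetical run: 10,000 GPU-hours at 0.3 kW average draw per GPU,
# in a facility with a PUE of 1.5, on a grid emitting 0.4 kg CO2/kWh.
print(training_emissions_kg(10_000, 0.3, 1.5, 0.4))  # ~1,800 kg CO2
```

The same arithmetic also shows why the energy source matters so much: swapping the assumed grid intensity of 0.4 kg CO2/kWh for a low-carbon grid an order of magnitude cleaner cuts the estimate by the same factor, with no change to the workload itself.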
How the Impact of AI Can be Reduced
While AI’s carbon footprint is significant, the good news is that there are ways to reduce it. Beyond developing more computationally efficient algorithms, one approach is to reevaluate the use of air-cooling systems, which consume a significant amount of energy.
Craig added: “Reducing the carbon footprint of AI involves addressing both the energy required for computation and the cooling processes. Traditional air-cooling systems, which have long been the standard in data centers, are continuously proving inefficient, especially as computational demands grow.
“These systems often struggle to effectively dissipate the heat generated by densely packed servers, leading to higher energy consumption and greater environmental impact.”
Dorian Shimy, CEO of FutureFund, argues that processing AI across multiple decentralized data centers can reduce overall energy consumption.
“Beyond just enhancing algorithmic efficiency or transitioning to renewable energy sources, we are seeing the emergence of decentralized AI processing
“This approach distributes the computational load across multiple, often smaller, and more efficient data centers that are closer to the source of data generation.
“This not only reduces the energy lost in data transmission but also allows for the use of local green energy sources that are otherwise not feasible on a larger scale.
“Such strategies represent a radical rethinking of traditional data center operations, promising substantial reductions in AI’s carbon footprint.”
The Bottom Line
Lowering AI’s carbon impact is tough, and in the midst of an AI arms race, going green is often going to lose out to innovation and profit.
That being said, companies that want to develop AI responsibly can do so by using computationally efficient models and switching to sustainable data centers.