AI Has a Huge Climate Change Problem

When it comes to the climate crisis, AI is a part of the problem. But it can also be part of the solution. Take a closer look at the environmental costs of AI, its potential upsides and how it can contribute to a greener future.

Written by Ellen Glover
Updated by Matthew Urwin | Nov 27, 2024

As the world continues to grapple with the increasingly urgent issue of climate change, artificial intelligence appears to be a promising solution. In just the last few years, AI has been used to model more accurate climate predictions, optimize energy efficiency in buildings and monitor things like deforestation and ocean health through satellite imagery. 

How Does AI Affect the Environment?

Artificial intelligence harms the climate because it relies on data centers, which produce carbon emissions and consume large quantities of water. Manufacturing AI hardware also generates e-waste and depletes raw materials, further harming the environment.

But AI’s relevance as a climate-change-fighting tool comes at a time of increasing ethical concerns, and the technology may actually be worsening the problem. The training of complex deep learning models requires a substantial amount of computational power, often leading to a significant carbon and water footprint. And the rapid growth of the AI industry has led to an unsustainable rise in demand for hardware and the raw materials used to make it, which takes a massive toll on the planet’s air and soil quality.

The duality of AI’s role in the climate crisis has prompted researchers, companies and users alike to reckon with the industry’s ecological consequences and evaluate its place in our pursuit of a greener future.


How AI and data centers impact climate change. | Video: CBS Mornings

The Environmental Cost of AI

The artificial intelligence industry is what scholar Kate Crawford calls an “extractive” business. In her book Atlas of AI, Crawford notes that minerals such as mercury and zinc, along with rare earth elements, must be mined in large quantities to build the machines used to train and run AI models. Coal and oil must be drawn from the earth to power the data centers that provide the necessary computational power and resources for AI algorithms to work.

All of this comes at a cost to the environment, not only in the form of soil and air pollution, but also in carbon emissions and water use.

AI’s Carbon Footprint

“Minerals are the backbone of AI, but its lifeblood is still electrical energy,” Crawford wrote. “In reality, it takes a gargantuan amount of energy to run the computational infrastructures of Amazon Web Services or Microsoft’s Azure, and the carbon footprint of the AI systems that run on those platforms is growing.” 

Tech companies have learned this lesson the hard way. In its 2024 environmental report, Google revealed that its carbon emissions increased 48 percent between 2019 and 2023. Meanwhile, Microsoft shared in its 2024 environmental sustainability report that its emissions rose over 29 percent between 2020 and 2023. Both blame these trends on data centers, which have become vital to supporting the companies’ investments in AI.    

In addition, a 2022 research paper published by AI startup Hugging Face found that making its language model BLOOM resulted in more than 50 metric tons of carbon dioxide emissions, roughly the equivalent of 60 flights from London to New York. And that’s significantly less than the emissions associated with other language models.

“You need huge amounts of data,” Kasper Groes Albin Ludvigsen, a data scientist and green AI advocate, told Built In. “The larger the models are, the more energy-intensive they are.”

And according to Ludvigsen, using an AI system actually emits much more carbon dioxide than training one. Researchers at UC Berkeley and Google found that it took 1,287 megawatt hours of electricity to train OpenAI’s GPT-3 model, which is enough energy to supply an average United States household for about 120 years. Back when ChatGPT was still using GPT-3, the chatbot received millions of daily queries, equating to what Ludvigsen estimates to be about 4,000 to 6,000 megawatt hours of electricity — three to five times what it took to train GPT-3 in the first place.
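
Those comparisons are easy to verify with simple arithmetic. The sketch below reproduces them in Python, assuming an average U.S. household uses roughly 10,700 kWh of electricity per year; that household figure is an assumption based on typical U.S. averages, not a number from the article.

```python
# Back-of-the-envelope check of the GPT-3 energy figures cited above.
# The household consumption figure (~10,700 kWh/year) is an assumption
# based on typical U.S. averages, not a number taken from this article.

GPT3_TRAINING_MWH = 1_287          # reported energy to train GPT-3
HOUSEHOLD_KWH_PER_YEAR = 10_700    # assumed average U.S. household usage

training_kwh = GPT3_TRAINING_MWH * 1_000
household_years = training_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"Training energy is roughly {household_years:.0f} household-years")  # ~120

# Ludvigsen's estimate for serving millions of daily ChatGPT queries
for inference_mwh in (4_000, 6_000):
    ratio = inference_mwh / GPT3_TRAINING_MWH
    print(f"{inference_mwh} MWh of inference is about {ratio:.1f}x the training energy")
```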

“It’s not just about one company and their one model. Many companies are developing these types of models,” Ludvigsen said, adding that, as AI adoption increases, so too will the environmental costs of maintaining it. 

AI’s Water Footprint

Training AI also consumes a lot of water, which is pumped in and out of data centers to prevent servers from overheating. One 2023 study found that training GPT-3 in Microsoft’s U.S. data centers can directly consume 700,000 liters of fresh water. And Google’s U.S. data centers consumed 12.7 billion liters of freshwater for on-site cooling in a single year, 90 percent of which was potable water.
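
To see how such water figures are typically derived, the sketch below multiplies a workload’s energy use by a data center’s water usage effectiveness (WUE), the liters of water consumed on site per kilowatt-hour. The WUE values are illustrative assumptions, not figures from the study.

```python
# Illustrative water-footprint estimate based on water usage effectiveness
# (WUE): liters of water consumed on site per kWh of IT energy.
# The WUE values below are hypothetical; real values vary by facility,
# season and cooling technology.

GPT3_TRAINING_MWH = 1_287  # training energy figure cited earlier in the article

example_wue_liters_per_kwh = {
    "efficient, cool-climate site": 0.2,
    "typical hyperscale site": 0.55,
    "hot, dry-climate site": 1.8,
}

for site, wue in example_wue_liters_per_kwh.items():
    liters = GPT3_TRAINING_MWH * 1_000 * wue
    print(f"{site}: roughly {liters:,.0f} liters for a GPT-3-scale training run")
```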

Water is also needed to manufacture the graphics processing units (GPUs) used to train AI. According to Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who co-authored the 2023 paper, a typical semiconductor factory consumes millions of gallons of water a day, and that water needs to be ultra-pure because it is used for cleaning.

AI’s massive consumption of water is a problem in places where water is a highly limited resource, and that is true not only of making AI but also of its daily use. For instance, OpenAI deploys servers all over the world, some of which are in regions where water is scarce, to enable the rapid responses of ChatGPT. This continues to cause “tension” with local water users in these areas, Ren told Built In. “This is the reason why we have to pay attention to AI’s water footprint.”


AI as a Solution

Still, the use of AI can also do a lot of good. During a 2024 special meeting, the United Nations’ Economic and Social Council explored how AI is helping nations achieve the UN’s sustainable development goals. The council found that AI has already been used to maintain circular economies, track greenhouse gas emissions and help farmers better manage their resources, among other real-life applications.  

In addition, the nonprofit Climate Change AI published a report in 2022 explicitly outlining some of the ways artificial intelligence can be used to address issues related to climate change. From optimizing power grids and enabling smart buildings to enhancing transportation systems and facilitating precision agriculture, AI has proven to be a powerful tool for reducing global greenhouse gas emissions and improving energy efficiency.

And AI’s predictive capabilities can be used in everything from climate modeling to the development of mitigation policies — allowing scientists to ask “what if” questions, and giving policymakers the knowledge they need to weigh the costs and benefits of a particular strategy. 

“AI is really good in those cases where you need to do something at a larger scale, or at a faster time granularity, using a lot of data,” Priya Donti, co-founder and executive director of Climate Change AI, told Built In. And while it won’t solve every single problem related to climate change, “it is a tool that, alongside engineering, the social sciences and policy, needs to be part of the mix.”

But this raises the question: Do the environmental benefits of AI outweigh the accompanying risks? Yes, artificial intelligence is being used to track and mitigate climate change, but it is also being used to accelerate oil and gas extraction. And while implementing AI can certainly save organizations time and money, the ends may not always justify the means.


The Path to a Greener Future

Work is being done to reduce AI’s carbon and water footprints, allowing it to be a more sustainable tool in the fight against climate change.

Reducing Data Centers’ Carbon Footprint 

Tech giants like Google and Microsoft have pledged to power all of their operations, including data centers, with renewable energy within the next decade or so. Some data centers have also started supplementing their normal power sources with renewables like solar panels and scheduling workloads for optimal times, such as when the sun is highest in the sky; research shows this to be an effective way to reduce carbon emissions.

UC Riverside professor Ren said a similar tactic can be used to reduce water consumption. If a data center runs midday, when temperatures are at their highest, water efficiency is low because more water is required to keep servers cool. And a data center in Arizona is much less water efficient than one in Washington State because of the differences in climate. Companies can use these temporal and spatial differences to foster more efficient water use: for example, training can be done at night, when temperatures are cooler, or in Canada instead of Arizona.
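
A minimal sketch of that scheduling idea: given rough forecasts of grid carbon intensity and water efficiency for a few candidate times and locations, pick the slot with the lowest combined cost. All of the forecast numbers and weights below are made up for illustration.

```python
# Toy example of carbon- and water-aware scheduling: choose when and where
# to run a training job based on forecast grid carbon intensity
# (gCO2 per kWh) and water usage effectiveness (liters per kWh).
# Every number and weight below is illustrative, not real data.

candidate_slots = [
    {"name": "Arizona, midday",    "gco2_per_kwh": 450, "wue_l_per_kwh": 2.0},
    {"name": "Arizona, night",     "gco2_per_kwh": 400, "wue_l_per_kwh": 0.9},
    {"name": "Washington, midday", "gco2_per_kwh": 120, "wue_l_per_kwh": 0.6},
    {"name": "Canada, night",      "gco2_per_kwh": 100, "wue_l_per_kwh": 0.3},
]

def slot_cost(slot, carbon_weight=1.0, water_weight=100.0):
    """Weighted cost per kWh; the weights express an arbitrary trade-off."""
    return carbon_weight * slot["gco2_per_kwh"] + water_weight * slot["wue_l_per_kwh"]

best = min(candidate_slots, key=slot_cost)
print(f"Schedule the job for: {best['name']}")
```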

Producing More Energy-Efficient AI Models 

AI engineers and researchers are also making headway in creating more environmentally friendly models, like transformers, which can process more data in less time, thereby reducing energy consumption. Research has found that tweaking the settings of the cloud service an algorithm runs on during training can actually make the process much more energy efficient.
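
One practical way to act on such findings is to measure a training run’s energy and emissions directly. The sketch below wraps a placeholder training function with the open-source codecarbon package’s EmissionsTracker; the train_model function and project name are hypothetical stand-ins for a real workload.

```python
# Sketch: measuring a training run's estimated emissions with the open-source
# codecarbon package (pip install codecarbon). train_model() is a placeholder
# for a real training loop.

from codecarbon import EmissionsTracker

def train_model():
    # Stand-in for an actual training loop (e.g., a PyTorch fit loop).
    pass

tracker = EmissionsTracker(project_name="example-training-run")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent

print(f"Estimated emissions for this run: {emissions_kg:.6f} kg CO2eq")
```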

There’s also a growing movement, known as frugal AI, to design models that remain robust while using less data and therefore less energy. This may involve relying on fewer features to make predictions, or limiting the computational power and memory used to train and deploy a model.
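
As a toy illustration of the frugal-AI idea, the sketch below trains a scikit-learn classifier on only the five most informative features of a built-in dataset and compares its accuracy to a model trained on all 30 features; the dataset and the choice of five features are arbitrary examples, not a recommendation from the article.

```python
# Toy illustration of "frugal AI": keep only the most informative features
# and train on them, trading a little accuracy for less data and compute.
# Uses scikit-learn's built-in breast cancer dataset purely as an example.

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Full model: all 30 features
full_model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
full_model.fit(X_train, y_train)

# "Frugal" model: only the 5 most informative features
frugal_model = make_pipeline(SelectKBest(f_classif, k=5),
                             StandardScaler(),
                             LogisticRegression(max_iter=1000))
frugal_model.fit(X_train, y_train)

print("Full model accuracy:  ", accuracy_score(y_test, full_model.predict(X_test)))
print("Frugal model accuracy:", accuracy_score(y_test, frugal_model.predict(X_test)))
```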

Promoting Transparency Around AI’s Climate Impact

The real linchpin of a more energy-efficient AI industry will be accurate measurement and transparency, said AI expert Bernard Marr in a Forbes op-ed. Companies like Salesforce and Microsoft have rolled out tools to help businesses visualize and understand their overall carbon footprints, while the Machine Learning Emissions Calculator can help businesses estimate the carbon footprints of their AI models specifically, looking at factors like cloud provider, geographic region and hardware used.
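
The arithmetic behind calculators of this kind is essentially hardware power draw times runtime, scaled by data-center overhead (PUE) and the carbon intensity of the local grid. The sketch below implements that formula; every numeric input is an illustrative assumption rather than an output of the Machine Learning Emissions Calculator itself.

```python
# Minimal emissions estimate in the spirit of the Machine Learning Emissions
# Calculator: power draw x runtime x data-center overhead (PUE) x grid carbon
# intensity. All numeric inputs below are illustrative assumptions.

def estimate_emissions_kg(gpu_power_watts: float,
                          num_gpus: int,
                          hours: float,
                          pue: float,
                          grid_kgco2_per_kwh: float) -> float:
    """Estimated kg of CO2-equivalent for a training or inference job."""
    energy_kwh = (gpu_power_watts * num_gpus / 1_000) * hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# Example: 8 GPUs at ~300 W each, running for a week, in a facility with a
# PUE of 1.2, on a grid emitting ~0.4 kg CO2 per kWh.
print(estimate_emissions_kg(gpu_power_watts=300, num_gpus=8,
                            hours=24 * 7, pue=1.2,
                            grid_kgco2_per_kwh=0.4))
```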

“We need to really shape and steer the development of AI in ways that are fundamentally aligned with what we need for climate change-related problems, rather than climate change-related problems being an afterthought,” Donti said. “We can’t just develop AI for business as usual use cases and then hope that the climate change space benefits from it.”

Frequently Asked Questions

How does AI affect climate change?

To operate, artificial intelligence depends on massive data centers, which consume large amounts of water and natural resources while producing steady carbon emissions. As a result, AI can worsen the impacts of climate change if not used responsibly.

Is AI harmful to the environment?

Yes, AI is harmful to the environment. Training and using artificial intelligence produces carbon emissions and consumes a lot of water. And manufacturing AI hardware requires raw materials, which erodes the planet’s air and water quality and generates e-waste.

What is the carbon footprint of AI?

The carbon footprint of AI varies depending on the company developing it, with major tech brands like Google and Microsoft often generating significant amounts of carbon emissions. Tools like the Machine Learning Emissions Calculator can help businesses estimate the size of their carbon footprint when using AI technologies.
