As the world continues to grapple with the increasingly urgent issue of climate change, artificial intelligence appears to be a promising solution. In just the last few years, AI has been used to model more accurate climate predictions, optimize energy efficiency in buildings and monitor things like deforestation and ocean health through satellite imagery.
But AI’s relevance as a climate-change-fighting tool comes at a time of increasing ethical concerns, and the technology may actually be worsening the problem. The training of complex deep learning models requires a substantial amount of computational power, often leading to a significant carbon and water footprint. And the rapid growth of the AI industry has led to an unsustainable rise in demand for hardware and the raw materials used to make it, which takes a massive toll on the planet’s air and soil quality.
The duality of AI’s role in the climate crisis has prompted researchers, companies and users alike to reckon with the industry’s ecological consequences and evaluate its place in our pursuit of a greener future.
The Environmental Cost of AI
The artificial intelligence industry is what scholar Kate Crawford calls an “extractive” business. In her book Atlas of AI, Crawford notes that minerals like mercury and zinc must be mined in large quantities in order to build the machines used to train and run AI models. Coal and oil must be drawn from the earth in order to power the data centers that provide the necessary computational power and resources for AI algorithms to work.
All of this comes at a cost to the environment, not only in the form of soil and air pollution but also in carbon emissions and water use.
AI’s Carbon Footprint
“Minerals are the backbone of AI, but its lifeblood is still electrical energy,” Crawford wrote. “In reality, it takes a gargantuan amount of energy to run the computational infrastructures of Amazon Web Services or Microsoft’s Azure, and the carbon footprint of the AI systems that run on those platforms is growing.”
A 2022 research paper published by AI startup Hugging Face found that training its language model BLOOM resulted in more than 50 metric tons of carbon dioxide emissions, roughly the equivalent of 60 flights from London to New York. And that’s significantly less than the emissions associated with other language models.
Another paper, published in 2019 by researchers at the University of Massachusetts Amherst, found that training a single large AI model (in this case, a transformer tuned with neural architecture search) can emit more than 626,000 pounds of carbon dioxide, nearly five times the lifetime emissions of the average American car, including the manufacturing of the car itself.
The paper focused specifically on the model training process for natural language processing, a subfield of AI that teaches machines how to handle human language, used to create large language models like GPT-3 and GPT-4 (powering OpenAI’s ChatGPT) and PaLM2 (powering Google’s Bard).
“The larger the models are, the more energy intensive they are.”
“You need huge amounts of data,” Kasper Groes Albin Ludvigsen, a data scientist and green AI advocate, told Built In. “The larger the models are, the more energy intensive they are.”
The problem doesn’t stop at training AI either. According to Ludvigsen, actually using an AI system emits much more carbon dioxide than training one.
Researchers at UC Berkeley and Google found that it took 1,287 megawatt-hours of electricity to train GPT-3, enough energy to supply an average U.S. household for about 120 years. Meanwhile, back when ChatGPT was still running on GPT-3 (around March of 2023), the chatbot received millions of daily queries, which Ludvigsen estimates equated to about 4,000 to 6,000 megawatt-hours of electricity: three to five times what it took to actually train GPT-3 in the first place.
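For a back-of-the-envelope sense of these figures, the arithmetic can be sketched in a few lines. The household figure of roughly 10,600 kWh per year is an assumed average, in line with commonly cited U.S. estimates, not a number from the studies themselves:

```python
# Back-of-the-envelope check of the energy figures above.
# Assumption: an average U.S. household uses ~10,600 kWh of
# electricity per year (an approximate, commonly cited figure).

GPT3_TRAINING_MWH = 1_287          # UC Berkeley / Google estimate
HOUSEHOLD_KWH_PER_YEAR = 10_600    # assumed average

years_of_household_power = GPT3_TRAINING_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(f"Training energy ~ {years_of_household_power:.0f} years of household electricity")

# Ludvigsen's inference estimate: 4,000-6,000 MWh to serve
# ChatGPT's query load, compared against the training cost.
inference_low, inference_high = 4_000, 6_000
print(f"Inference vs. training: {inference_low / GPT3_TRAINING_MWH:.1f}x "
      f"to {inference_high / GPT3_TRAINING_MWH:.1f}x")
```

Running the numbers this way reproduces both claims in the text: roughly 120 years of household electricity, and an inference load three to five times the training cost.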
“It’s not just about one company and their one model. Many companies are developing these types of models,” Ludvigsen said, adding that, as AI adoption increases, so too will the environmental costs of maintaining it.
AI’s Water Footprint
Training AI also consumes a lot of water, which is pumped in and out of data centers to prevent servers from overheating.
One 2023 study estimated that training GPT-3 in Microsoft’s U.S. data centers can directly consume 700,000 liters of fresh water. And Google’s U.S. data centers consumed 12.7 billion liters of fresh water for on-site cooling in a single year, 90 percent of which was potable water.
Water is also needed to actually produce the graphics processing units (GPUs) used in the AI training process. According to Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who co-authored the 2023 paper, a typical semiconductor factory consumes millions of gallons of water a day, and that water needs to be ultrapure because it is used for cleaning.
AI’s massive consumption of water is a problem when you consider all the places where water is a highly limited resource. This is true not only in the making of AI, but also in its daily use.
For instance, in order to enable the rapid responses of ChatGPT, OpenAI has to deploy servers all over the world — some of which are in regions where water is scarce. This is continuing to cause “tension” with local water users in these areas, Ren told Built In. “This is the reason why we have to pay attention to AI’s water footprint.”
AI as a Solution
Still, the use of AI can also do a lot of good. A 2020 research paper determined that, of the 169 targets the United Nations laid out in its Agenda for Sustainable Development to solve issues like world peace and gender equality, 79 percent could be significantly aided by the use of AI — particularly as it relates to the economy and the environment.
In fact, the nonprofit Climate Change AI published a report in 2022 explicitly outlining some of the ways artificial intelligence can be used to address issues related to climate change. From optimizing power grids and enabling smart buildings, to enhancing transportation systems and facilitating precision agriculture, AI has proven to be a powerful tool in helping to reduce global greenhouse gas emissions and foster energy efficiency.
And AI’s unique ability to make predictions based on the trends and patterns it spots in large quantities of data makes it good at providing insights in areas that are otherwise uncertain, like climate change. AI’s predictive capabilities can be used in everything from climate modeling to the development of mitigation policies, allowing scientists to ask “what if” questions and giving policymakers the knowledge they need to weigh the costs and benefits of a particular strategy.
“It is a tool that, alongside engineering, the social sciences and policy, needs to be part of the mix.”
“AI is really good in those cases where you need to do something at a larger scale, or at a faster time granularity, using a lot of data,” Priya Donti, co-founder and executive director of Climate Change AI, told Built In. And while it won’t solve every single problem related to climate change, “it is a tool that, alongside engineering, the social sciences and policy, needs to be part of the mix.”
But this raises the question: Do the environmental benefits of AI outweigh the accompanying risks? Yes, artificial intelligence is being used to track and mitigate climate change, but it is also being used to accelerate oil and gas extraction. And while implementing AI can certainly save organizations time and money, the ends may not always justify the means.
The Path to a Greener Future
Work is being done to reduce AI’s carbon and water footprints, allowing it to be a more sustainable tool in the fight against climate change.
Tech giants like Google and Microsoft have pledged to power all their operations, including data centers, with renewable energy within the next decade or so. And some data centers have started supplementing their normal power sources with renewables like solar panels, scheduling workloads to run when that clean supply peaks, such as when the sun is highest in the sky (research shows this to be an effective method of reducing carbon emissions).
UC Riverside professor Ren said a similar tactic can be used to reduce water consumption, since water efficiency depends heavily on time and location. If a data center is running at midday, when temperatures are at their highest, water efficiency is poor because more water is required to keep things cool. And a data center in Arizona is much less water efficient than one in Washington state, because of the differences in climate.
These temporal and spatial differences can actually be harnessed by companies to foster more efficient water use, Ren continued. Training can be done in Nordic countries or Canada instead of Arizona, for example. Or training can be done at night when temperatures are cooler. According to Ren, that would “easily save us some water.”
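The scheduling idea Ren describes can be sketched as a simple selection over candidate regions and times. The regions, hours and liters-per-kWh figures below are illustrative placeholders, not real measurements:

```python
# A minimal sketch of time- and location-aware scheduling:
# before launching a training job, pick the data-center region
# and hour with the lowest water use per unit of energy.
# All values below are illustrative, not real measurements.

candidates = [
    # (region, hour_utc, liters_per_kwh)
    ("arizona",    13, 3.8),   # hot desert afternoon: worst case
    ("arizona",     3, 2.1),   # same site at night: cooler, better
    ("washington", 13, 1.4),   # milder climate, even at midday
    ("oslo",        3, 0.9),   # cool Nordic night: best case
]

def pick_slot(slots):
    """Return the (region, hour, efficiency) tuple with the lowest
    water consumption per kWh."""
    return min(slots, key=lambda slot: slot[2])

region, hour, wue = pick_slot(candidates)
print(f"Schedule training in {region} at {hour:02d}:00 UTC (~{wue} L/kWh)")
```

The same selection logic applies to carbon intensity: swap the liters-per-kWh column for grams of CO2 per kWh and the scheduler favors hours when renewable generation peaks.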
AI engineers and researchers are also making headway in creating more environmentally friendly models, like transformers (the “T” in GPT), which are able to process more data in less time, thereby reducing energy consumption. And recent research has found that tweaking the settings of the cloud service an algorithm runs on during training can actually make the process much more energy efficient.
There’s also a growing movement, known as frugal AI, to design capable models that require less data and therefore less energy. This may involve using smaller datasets and fewer features to make predictions, or limiting the amount of computational power and memory used in training and deploying a model.
Looking ahead, the real linchpin of a more energy efficient AI industry will be accurate measurements and transparency, said AI expert Bernard Marr in a Forbes op-ed. Companies like Salesforce and Microsoft have rolled out their own tools to help businesses visualize and understand their own carbon footprints as a whole; and the Machine Learning Emissions Calculator can help businesses estimate the carbon footprints of their AI models specifically, looking at factors like cloud provider, geographic region and hardware used. All of this can then help businesses keep up with changing governmental and societal expectations about the AI industry’s role in environmental sustainability.
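The kind of estimate such a calculator produces boils down to a simple formula: hardware power draw times training time, scaled by the data center’s energy overhead (its PUE) and the local grid’s carbon intensity. Here is a minimal sketch of that formula with purely illustrative inputs; the parameter values are assumptions for the example, not figures from any real calculator or training run:

```python
# A rough sketch of a training-run emissions estimate:
#   device energy x data-center overhead (PUE) x grid carbon intensity.
# All inputs in the example are illustrative assumptions.

def estimate_co2_kg(gpu_watts: float, num_gpus: int, hours: float,
                    pue: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimated emissions in kg CO2e for one training run."""
    energy_kwh = gpu_watts * num_gpus * hours / 1_000  # device-level energy
    return energy_kwh * pue * grid_kg_co2_per_kwh      # facility + grid factor

# Example: 8 GPUs drawing 300 W each for one week (168 hours),
# in a facility with PUE 1.2, on a grid emitting 0.4 kg CO2e/kWh.
print(round(estimate_co2_kg(300, 8, 168, 1.2, 0.4), 1))  # kg CO2e
```

This is also why region and hardware matter so much in such calculators: the same job on an efficient accelerator in a low-carbon grid region can come out several times lower than the estimate above.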
“We need to really shape and steer the development of AI in ways that are fundamentally aligned with what we need for climate change-related problems, rather than climate change-related problems being an afterthought,” Donti said. “We can’t just develop AI for business as usual use cases and then hope that the climate change space benefits from it.”