What about the environmental impacts of AI?

Like any industrial activity, the use of AI has an environmental cost. The two concerns that come up most often are greenhouse gas (GHG) emissions from electricity generation and water usage.[1] How do these impacts compare to other human activities, at both the global and local scales?

Methodology

Most of the environmental impact of AI comes from the datacenters used for training and inference. We can approximate the impact of AI by estimating the total impact of all datacenters, then scaling by the proportion of those datacenters' resources that is used by AI.[2]
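To make that two-step estimate concrete, here is a minimal sketch in Python. The figures in the example are illustrative placeholders, not measured values; the only point is that AI's impact is estimated as a fraction of the datacenter total.

```python
# Minimal sketch of the two-step estimate: total datacenter impact x AI's share.
# The numbers below are illustrative placeholders, not measurements.

def ai_impact(total_datacenter_impact: float, ai_fraction: float) -> float:
    """Approximate AI's impact as a fraction of the total datacenter impact."""
    return total_datacenter_impact * ai_fraction

# e.g. a hypothetical 100 units of datacenter emissions, with AI using ~2.5% of resources
print(ai_impact(100.0, 0.025))  # -> 2.5 units attributable to AI
```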

GHG emissions from datacenters

Datacenters use substantial amounts of electricity both to run the computers and to cool them. Some of that electricity comes from non-renewable sources, which have high carbon footprints. How big is that impact?

Sources disagree about the current scale of datacenters' GHG emissions. One source estimated that in 2023, datacenters were using 4% of US electricity and producing 2% of its GHG emissions. Another source claims that in 2021, datacenters were responsible for about 1% of global GHG emissions. For comparison, aviation produces 4% of global GHGs. The International Energy Agency looked at the numbers for datacenters and AI and was mostly unconcerned.

Water usage from datacenters

Datacenters can use a lot of water in their operations, mostly for cooling through evaporation.[3] In addition, the water used to generate the electricity they consume is usually counted in their total water usage. A substantial amount of this water is potable, although greywater is sometimes used.

The worldwide water withdrawal of datacenters is around 500 billion litres per year, which puts it at about 0.02% of global freshwater withdrawal. Globally, this is a small amount, but individual datacenters are still large water users and can strain the supply of neighboring towns.
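As a sanity check on that percentage, the arithmetic looks like this. The global freshwater withdrawal figure below is an assumption (published estimates are on the order of a few thousand cubic kilometres per year), so treat the result as an order-of-magnitude check rather than a precise value.

```python
# Rough sanity check of the global share quoted above (order of magnitude only).
datacenter_water_l = 500e9       # ~500 billion litres per year
global_withdrawal_l = 2.5e15     # assumed ~2,500 km^3/year; estimates range up to ~4,000 km^3
share = datacenter_water_l / global_withdrawal_l
print(f"Datacenters: ~{share:.3%} of global freshwater withdrawal")  # ~0.020%
```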

AI use in datacenters

Not all datacenter resource use goes to AI, and it is hard to estimate what fraction of compute is devoted to AI, but estimates agree that the share is growing. Alex de Vries estimated that in 2023, AI accounted for under 2.5% of datacenter energy use. Another source estimated the share to be on the order of 2% in Q1 2024, but projected that it could grow to 7% by 2025.

These numbers suggest that, to account for AI specifically, we should scale the datacenter figures above down by a factor of roughly 15 to 40, i.e. treat AI's energy and water use as about one order of magnitude (or somewhat more) smaller than the total impact of datacenters.
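Plugging the earlier global figures into that scaling gives a rough sense of AI's worldwide share. The datacenter-level shares (~1% of global GHG emissions, ~0.02% of freshwater withdrawal) are the estimates quoted above, and the AI fractions span the 2.5-7% range; everything here is back-of-envelope.

```python
# Back-of-envelope: AI's global share = datacenter share x AI's fraction of datacenter use.
datacenter_share_of_ghg = 0.01        # ~1% of global GHG emissions (2021 estimate above)
datacenter_share_of_water = 0.0002    # ~0.02% of global freshwater withdrawal

for ai_fraction in (0.025, 0.07):     # ~2.5% (2023 estimate) to ~7% (2025 projection)
    ghg = datacenter_share_of_ghg * ai_fraction
    water = datacenter_share_of_water * ai_fraction
    print(f"AI at {ai_fraction:.1%} of datacenter use: "
          f"~{ghg:.3%} of global GHG, ~{water:.4%} of freshwater withdrawal")
```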

Individual use

Such global numbers are not particularly helpful for determining the individual impact of using AI. For instance, should you avoid using an LLM to help you write an email because of its environmental impact?

For popular models, the total compute used for training is generally comparable to the total compute used for inference over the model's lifetime, so amortized training roughly doubles the per-query cost; as a first approximation of the energy cost of using these models, we can therefore look at inference alone. GPT-3 has been reported to consume about 0.004 kWh per page of text produced, which translates to roughly 17 ml of water and 2 grams of CO2-equivalent.[4]
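The conversion from energy to water and carbon works through intensity factors. The factors below are assumptions chosen to be consistent with the figures just quoted (roughly 4 litres of water and 500 g of CO2-equivalent per kWh, counting both the datacenter and electricity generation); actual values vary by location and grid mix.

```python
# Converting per-page energy into water and carbon via assumed intensity factors.
energy_kwh_per_page = 0.004   # reported GPT-3 figure
water_l_per_kwh = 4.25        # assumed water intensity (datacenter + power generation)
co2e_g_per_kwh = 500          # assumed grid carbon intensity

water_ml = energy_kwh_per_page * water_l_per_kwh * 1000
co2e_g = energy_kwh_per_page * co2e_g_per_kwh
print(f"~{water_ml:.0f} ml of water and ~{co2e_g:.0f} g CO2e per page")  # ~17 ml, ~2 g
```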

Is this a lot? Not really. A one-page request to GPT-3 uses about one tenth of the energy needed to boil a cup of water, and the comparison for GHG emissions is similar. The emissions are also comparable to travelling 10 m in a gas-powered car.
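The arithmetic behind those comparisons is roughly as follows. The cup size, starting temperature, and the car's emission factor (~200 g CO2 per km) are assumed typical values, not figures from the sources above.

```python
# Rough arithmetic behind the kettle and car comparisons (assumed typical values).
SPECIFIC_HEAT_WATER_J = 4186          # J per kg per degree C

# Heating a ~0.35 L cup of water from ~15 C to boiling
boil_kwh = 0.35 * SPECIFIC_HEAT_WATER_J * 85 / 3.6e6
print(f"Boiling a cup: ~{boil_kwh:.3f} kWh; one LLM page is {0.004 / boil_kwh:.0%} of that")

# A typical gasoline car emits roughly 200 g CO2 per km
metres_for_2g = 2 / 200 * 1000
print(f"~2 g CO2e is about {metres_for_2g:.0f} m of driving")
```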

For water, this 17 ml is about 15,000 times less than what is needed to grow a head of lettuce, and about a million times less than what is needed to produce a kilogram of beef or chocolate. It is also dwarfed by the water needed to produce everyday items, such as the roughly 2,700 L needed to produce a t-shirt.
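For scale, here is the same comparison as straightforward ratios. The footprint figures are the approximate values used in the text (the lettuce figure is the one implied by the 15,000x comparison); they are rough, commonly cited estimates rather than precise measurements.

```python
# Per-page water use (~17 ml) next to some commonly cited water footprints (approximate).
page_l = 0.017
footprints_l = {
    "head of lettuce": 250,      # implied by the ~15,000x comparison above
    "kg of beef": 15_000,        # common rough estimate
    "kg of chocolate": 17_000,   # common rough estimate
    "t-shirt": 2_700,
}
for item, litres in footprints_l.items():
    print(f"one {item} uses as much water as ~{litres / page_l:,.0f} LLM pages")
```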

So, all in all, the environmental impact of AI is quite small at both the global and individual scales. Still, datacenters can have a large impact on the communities near them, and if AI were to play a much larger role in our society, the global impacts might become meaningful.


[1] Other examples include land use, mining for materials, and electronic waste.

[2] We don't distinguish here between compute, electricity use, cooling, and material use, and we assume that AI has a resource profile similar to other datacenter uses.

[3] Some water is recirculated for cooling; we do not count this as water loss here.

[4] Sasha Luccioni found similar results for BLOOM. More research would be needed to understand the impact of more recent models, but if price per token is a good indicator of energy use, the energy cost has been dropping sharply. It's worth noting that models that use a lot of compute at inference time, such as OpenAI's o1 and o3, may have substantially higher per-query costs.


