The Carbon Footprint of Machine Learning
- Milly Xu
- 7 days ago
- 4 min read
You’ve likely heard that making one hamburger takes 660 gallons of water, enough to fill 10 bathtubs. Now a new claim is circulating: every ChatGPT prompt is equivalent to pouring out a bottle of water. The numbers may sound exaggerated, but they point to the growing environmental footprint of machine learning, one that is becoming harder to ignore as artificial intelligence becomes an integral part of modern society.
Machine learning is a subset of artificial intelligence. It refers to systems that can ‘learn’ from data by identifying patterns, making predictions, and generating new output, all without being explicitly programmed for every possible task. Recently, the energy demands of large-scale AI models have made headlines. Estimates suggest that every day, ChatGPT uses enough electricity to charge 8 million phones (39.98 million kWh), while OpenAI’s projected energy use could soon rival the combined consumption of major cities such as San Diego and New York. But how exactly do these numbers come about, and what makes machine learning so incredibly energy intensive?
Overview of the problem
AI models begin as blank networks containing internal variables, known as parameters or weights, which determine how information is processed. During the training phase, a process called error-driven learning repeatedly adjusts these parameters so that the model can recognise patterns and make accurate predictions when a user gives a prompt.
For example, to teach a model to recognise cars, it’s fed datasets of labelled images of “cars” and “non-cars”. It makes predictions, compares them to the correct labels, and then adjusts its parameters based on the difference, or error, between the two. This process repeats billions of times, requiring immense computational power, until the model is accurate.
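That predict–compare–adjust cycle can be sketched in a few lines. The example below is a deliberately tiny toy: a single-parameter model learning to fit y = 2x with plain gradient descent, nothing like a production training run, but the same basic loop.

```python
# Toy illustration of error-driven learning: adjust one weight w so that
# the prediction w * x matches the labelled target 2 * x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, correct label)

w = 0.0              # the model's single parameter, starting "blank"
learning_rate = 0.05

for step in range(200):
    for x, target in data:
        prediction = w * x
        error = prediction - target       # compare prediction to the label
        w -= learning_rate * error * x    # nudge the parameter to shrink the error

print(round(w, 3))  # w converges towards 2.0
```

Real models repeat this same loop with billions of parameters and billions of examples, which is where the computational cost comes from.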
These parallel computations are handled by graphics processing units (GPUs) or tensor processing units (TPUs), specialised chips designed to break large calculations into smaller ones and solve them simultaneously. Each chip draws electricity, and a single AI training run can consume hundreds of megawatt hours. When that energy comes from burning fossil fuels, it translates directly into carbon dioxide emissions. Since roughly 60% of global electricity was still sourced from fossil fuels in 2024, AI training is leaving an increasingly significant carbon footprint.
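A rough back-of-envelope calculation shows how those figures translate into emissions. The energy and grid-intensity values below are illustrative assumptions, not measurements for any specific model:

```python
# Back-of-envelope emissions estimate (illustrative figures only).
training_energy_mwh = 500          # assumed energy for one large training run
grid_intensity_kg_per_kwh = 0.4    # assumed average grid carbon intensity

energy_kwh = training_energy_mwh * 1000                      # MWh -> kWh
emissions_tonnes = energy_kwh * grid_intensity_kg_per_kwh / 1000  # kg -> tonnes

print(emissions_tonnes)  # 200.0 tonnes of CO2 under these assumptions
```

Even with conservative inputs, a single training run can plausibly emit hundreds of tonnes of CO₂, which is why the choice of electricity source matters so much.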
Water footprint
The environmental cost of AI is not limited to carbon emissions. Data centres, which contain the cloud servers running AI computations, generate intense heat. Think of a laptop getting hot after use, but on a massive scale. Cooling systems are necessary to prevent overheating of the hardware, and many rely on liquid cooling (for more information, visit: Basics of Liquid Cooling in AI Data Centres), in which large amounts of water are used to dissipate heat.
This growing demand for water is becoming increasingly unsustainable. What was previously a serious issue for data centres has been exacerbated by the rapid rise of machine learning. Medium-sized data centres can consume up to 110 million gallons of water annually, while larger ones can use as much as a small town of 50,000 people. In 2023, Microsoft’s data centre water usage rose by 34%, reaching 1.7 billion gallons, marking a 228% increase since 2017.
The strain is especially concerning in regions where water is already scarce, such as California and Arizona, where droughts, wildfires, and agricultural demands push water systems to their limits. California, home to the water-intensive almond-growing industry of the Central Valley, also contains over 300 data centres. The expansion of AI infrastructure in regions like these has intensified the competition for limited water supplies, placing data centres at odds with agricultural production and residential water use. As a result, they are diverting critical resources from ecosystems and communities already under pressure from climate change.
Conclusion
AI is revolutionising the way we work, think, and create, but it also carries a heavy sustainability cost that cannot be overlooked. From the carbon emissions of training massive models to the gallons of water needed for cooling, machine learning systems leave a significant footprint on the environment. There is, however, progress being made. DeepSeek, an AI company founded in 2023, is emerging as a more environmentally friendly alternative to ChatGPT. It uses a technique called Mixture of Experts (MoE), in which only the relevant sub-models are activated for a given task, reducing computational load. DeepSeek also operates on an estimated 23,000 fewer Nvidia H800 chips than ChatGPT, significantly reducing its energy usage.
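The idea behind MoE can be shown with a minimal sketch. The “experts” and gating rule below are invented for illustration and bear no relation to DeepSeek’s actual architecture; the point is simply that a gate picks the best-matching sub-models and lets the rest sit idle.

```python
# Minimal sketch of Mixture-of-Experts (MoE) routing.
# The experts and gating rule are invented for illustration.

def expert_a(x):
    return x * 2          # stand-in sub-model

def expert_b(x):
    return x + 10         # stand-in sub-model

def expert_c(x):
    return x ** 2         # stand-in sub-model

experts = [expert_a, expert_b, expert_c]

def gate(x):
    # Toy gating rule: lower score = better match for this input.
    return [abs(x - 1), abs(x - 5), abs(x - 10)]

def moe(x, k=1):
    scores = gate(x)
    # Route to the k best-matching experts; the others stay idle,
    # so only a fraction of the model's parameters do any work.
    chosen = sorted(range(len(experts)), key=lambda i: scores[i])[:k]
    return sum(experts[i](x) for i in chosen) / k

print(moe(1))   # routed to expert_a
print(moe(10))  # routed to expert_c
```

Because only k of the experts run per input, the computational (and therefore energy) cost scales with k rather than with the total number of experts.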
On the individual level, small choices matter too. It starts with the obvious but effective actions: limit AI use for trivial tasks - write that quick email yourself without running it through ChatGPT. When searching online, add “-ai” to your search query to avoid the energy-intensive AI Overview. And perhaps skip the “thank you” message to ChatGPT. I’m sure both the environment and Sam Altman would appreciate it.
Reference List
IEEE Spectrum (2025) The Real Story on AI’s Water Use—and How to Tackle It, IEEE Spectrum, 10 September. Available at: https://spectrum.ieee.org/ai-water-usage.
Yañez-Barnuevo, M. (2025) Data Centers and Water Consumption, EESI, 25 June. Available at: https://www.eesi.org/articles/view/data-centers-and-water-consumption.
Business Energy UK (2025) ChatGPT Energy Consumption Visualized, BusinessEnergyUK. Available at: https://www.businessenergyuk.com/knowledge-hub/chatgpt-energy-consumption-visualized/.
UNRIC (2025) Artificial intelligence: How much energy does AI use?, UN Regional Information Centre. Available at: https://unric.org/en/artificial-intelligence-how-much-energy-does-ai-use/.
Penn State Engineering (2025) Why AI uses so much energy, and what we can do about it, IEE / Penn State Engineering News. Available at: https://iee.psu.edu/news/blog/why-ai-uses-so-much-energy-and-what-we-can-do-about-it.
Flex Power Modules (n.d.) The basics of liquid cooling in AI data centers. Available at: https://flexpowermodules.com/the-basics-of-liquid-cooling-in-ai-data-centers.