
When ChatGPT was first released in late 2022, I was still completing my undergraduate degree in chemical engineering. For fun, my friends and I tried using it to solve some of our thermodynamics homework. The answer that ChatGPT spit out seemed correct at a glance — it used equations we had discussed in class and covered familiar topics. However, when we went through the solution line by line and compared it to the answer key, it was completely erroneous. I immediately became skeptical of ChatGPT’s usefulness for substantial tasks.
Nevertheless, as generative artificial intelligence (GenAI) has rapidly evolved over the past few years, it has changed how many of us operate. As developers continue to train AI models and equip new generations of tools with additional features, GenAI has become more accurate and useful for everyday tasks. Many organizations and companies, including AIChE, have offered training that teaches their employees how to use AI to make the workplace more efficient. For many, AI has become indispensable. Unfortunately, as with all new technologies, GenAI (and AI as a whole) comes with a set of ethical dilemmas. In addition to privacy, transparency, and security fears, a new apprehension has arisen among scientists and climate-aware individuals: the environmental impact of AI.
Training an AI model involves repeated computations to adjust billions of parameters, a process that requires an immense amount of power. This training runs on high-performance computing (HPC) infrastructure, in which thousands of tensor processing units (TPUs), graphics processing units (GPUs), and central processing units (CPUs) operate in parallel. This equipment is housed in data centers, which, on average, span roughly 100,000 ft². However, these centers can be massive, with the largest U.S. data center (owned by Meta) spanning 4.6 million ft² (1).
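To see why training consumes so much electricity, a back-of-envelope estimate helps. The sketch below uses a common rule of thumb of roughly six floating-point operations per parameter per training token; the model size, token count, GPU throughput, and power draw are illustrative assumptions for this sketch, not figures reported in this article.

```python
# Back-of-envelope estimate of the electricity needed to train a large AI
# model. All figures below are illustrative assumptions, not measured values.

params = 100e9               # assumed model size: 100 billion parameters
tokens = 1e12                # assumed training-set size: 1 trillion tokens
flops = 6 * params * tokens  # rule of thumb: ~6 FLOPs per parameter per token

gpu_flops = 300e12           # assumed sustained throughput per GPU: 300 TFLOPS
gpu_power_kw = 0.7           # assumed power draw per GPU, in kW

gpu_hours = flops / gpu_flops / 3600         # total GPU-hours of compute
energy_gwh = gpu_hours * gpu_power_kw / 1e6  # electricity in GWh

print(f"{gpu_hours:,.0f} GPU-hours, ~{energy_gwh:.2f} GWh of electricity")
```

Even under these optimistic assumptions (perfect hardware utilization, no cooling or networking overhead), a single training run works out to hundreds of thousands of GPU-hours; repeated experiments, retraining, and day-to-day inference multiply the total well beyond this.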
The environmental impact of data centers
There are three primary ways that data centers negatively impact the environment. First, and perhaps most obvious, is their exorbitant energy usage. Much of this energy comes from fossil-fuel-based electricity, contributing significantly to greenhouse gas (GHG) emissions. By 2026, the world’s data centers are expected to double their electricity demand relative to 2022, consuming over 1,000 TWh of electricity — a value roughly equivalent to the annual electricity consumption of Japan (2). In the U.S., data centers are expected to account for 6% of the nation’s total electricity usage by 2026, which could overwhelm current grid infrastructure (3).
Second, data centers generate large amounts of heat, requiring evaporative cooling processes that deplete the world’s limited freshwater supply. In 2022, Google’s data centers consumed roughly 5 billion gallons of freshwater — a 20% increase from 2021. Similarly, Microsoft reported that its water consumption increased by 34% from 2021 (4). While not all of this is attributable solely to AI (data centers serve many purposes), the numbers are still staggering. Evaporative cooling is also employed during the production of microchips. Furthermore, freshwater is evaporated during some types of electricity generation, such as at the thermoelectric and hydroelectric plants that supply power to the grid that data centers run on.
The third major environmental impact of data centers is the large volume of electronic waste (e-waste) they generate, owing to the short lifespan of GPUs and other HPC components (5). A new study conducted by researchers in China and the U.K. (6) projects that GenAI could account for 1.2–5 million metric tons (m.t.) of e-waste by 2030, adding to the current global total of around 60 million m.t. This growth could exacerbate the dilemma of post-use e-waste disposal, heightening safety concerns around hazardous metals, especially in low-income countries and communities. Improper disposal or recycling of e-waste can also contaminate soil and water, posing agricultural and health risks.
AI has elevated the discussion around environmental equity and what the price of progress might be. Like many environmental issues, the effects of AI do not equally impact all regions and communities. In addition to low- and lower-middle-income countries bearing the brunt of e-waste disposal-related problems, lower water availability and higher energy costs could disproportionately burden communities that are particularly vulnerable to adverse environmental impacts. Many data centers are located in rural areas with low land costs and plenty of room to expand. These areas often coincide with arid regions whose water supplies are already stressed by growing populations, such as parts of Arizona, Texas, and Chile. Additionally, data centers vary greatly in their consumption of fossil fuels and the resulting air pollution. For instance, in 2022, Google’s Finland data center operated on 97% carbon-free energy, in contrast to the 4–18% carbon-free energy in its Asian data centers (3). Moreover, the expansion of data centers and an increase in AI development could worsen socioeconomic disparities between regions — both globally and domestically — as increased electricity prices could raise local energy costs for communities that cannot easily afford higher living costs.
Like many of the environmental problems of the 21st century, there is no one-size-fits-all solution to lessening AI’s environmental impact. However, certain steps can be taken by institutions, corporations, governments, and users to decrease AI’s energy consumption, water intensity, and waste production and strive for equitable impacts across communities worldwide.
Given GenAI’s relative novelty, there is still a lot to learn about its environmental impact. Therefore, it is important that institutions continue to prioritize research that quantifies AI’s impact; more robust quantification methods will allow for effective mitigation. Additionally, training AI algorithms to be more accurate will improve energy efficiency. When corporations purchase AI models for their employees, they should optimize their model selection. AI models come in various sizes, and the high computational power requirements of a large model might not be justified for every business need. For example, large language models (LLMs) like ChatGPT and Bard comprise over 100 billion parameters — the weights that determine how a model responds to an input. On the other hand, phi-1.5 has only 1.3 billion parameters — nearly 100 times fewer — but performs similarly to much larger LLMs for many tasks (7). Technical advisers and executive staff should research which model best fits their company before purchasing a program. Doing so will not only reduce their environmental impact but likely lower costs as well.
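The parameter-count comparison above translates directly into hardware demands. As a rough illustration (the two bytes per parameter below assumes 16-bit weights, an assumption for this sketch rather than a detail from the article):

```python
# Rough comparison of the memory footprint of a large vs. a small language
# model, based only on parameter count. Sizes are illustrative assumptions.

BYTES_PER_PARAM = 2  # assuming 16-bit (2-byte) weights

def model_size_gb(n_params: float) -> float:
    """Approximate weight storage in gigabytes."""
    return n_params * BYTES_PER_PARAM / 1e9

large = model_size_gb(100e9)  # a 100-billion-parameter LLM
small = model_size_gb(1.3e9)  # a phi-1.5-class model (1.3 billion parameters)

print(f"Large model: ~{large:.0f} GB of weights; "
      f"small model: ~{small:.1f} GB ({large / small:.0f}x smaller)")
```

A model whose weights fit in a few gigabytes can run on a single commodity GPU, whereas a 100-billion-parameter model must be sharded across many accelerators, multiplying both capital cost and energy draw.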
The role of governments
Both federal and state governments have significant opportunities to ease AI’s contribution to climate change. The United Nations Environment Programme (UNEP) recommends four critical steps for governments worldwide: establish standard procedures for measuring AI’s environmental impact, develop regulations that require disclosure of the direct environmental consequences of AI, encourage companies to offset their carbon emissions and build green data centers powered by renewable energy, and set environmental policy that specifically addresses AI. Governments can also dedicate grant funding to research projects on AI efficiency and climate-impact quantification. Finally, governments should continue to invest in renewable energy; decreasing reliance on fossil fuels will help mitigate GHG emissions from the grid as data centers continue to consume large amounts of energy.
The role of users
Although it might seem small, the user does have some power to reduce AI’s effects on the environment. It is neither logical nor practical to stop using AI altogether, but users can be more strategic about when and how they use it. Training yourself in prompt engineering can significantly reduce trial-and-error communication with your chosen AI tool, making your interactions more efficient and cutting energy usage. (It’s not really necessary to send a separate message thanking your AI tool, either.) Furthermore, not all queries require AI; sometimes, traditional software is a more streamlined way to get your desired result. Consider educating yourself on the most effective uses of AI and GenAI, and on when a spreadsheet or script may suffice.
When AI is used to solve environmental problems, such as making buildings more energy-efficient or monitoring deforestation, it simultaneously advances solutions to climate change and hinders progress by producing additional GHG emissions and waste. At the moment, there is no way to quantify the scale of its positive vs. negative impacts. That said, AI’s positive uses should not diminish what needs to be done to decrease its negative impacts. As AI and other technologies continue to develop in the coming years, it will be imperative to view them through an environmental lens to understand how they may threaten global sustainability and to determine how individuals, governments, and researchers can work together to alleviate the new problems they generate.
- Mahan, J., “Largest Data Centers in the U.S.: A Comprehensive Overview,” https://cc-techgroup.com/largest-data-centers-in-the-us (accessed June 2, 2025).
- International Energy Agency, “Electricity 2024 – Analysis and Forecast to 2026,” https://www.iea.org/reports/electricity-2024, IEA, Paris, France (Jan. 2024).
- Ren, S., and A. Wierman, “The Uneven Distribution of AI’s Environmental Impacts,” Harvard Business Review, https://hbr.org/2024/07/the-uneven-distribution-of-ais-environmental-impacts (July 15, 2024).
- Berreby, D., “As Use of A.I. Soars, So Does the Energy and Water it Requires,” Yale School of the Environment, https://e360.yale.edu/features/artificial-intelligence-climate-energy-emissions (Feb. 6, 2024).
- Kandemir, M., “Why AI Uses So Much Energy — And What We Can Do About It,” Penn State Institute of Energy and the Environment, https://iee.psu.edu/news/blog/why-ai-uses-so-much-energy-and-what-we-can-do-about-it (Apr. 8, 2025).
- Wang, P., et al., “E-Waste Challenges of Generative Artificial Intelligence,” Nature Computational Science, 4, pp. 818–823 (Oct. 2024).
- Leffer, L., “When It Comes to AI Models, Bigger Isn’t Always Better,” Scientific American, https://www.scientificamerican.com/article/when-it-comes-to-ai-models-bigger-isnt-always-better (Nov. 21, 2023).
This article originally appeared in the Emerging Voices column in the July 2025 issue of CEP. Members have access online to complete issues, including a vast, searchable archive of back-issues found at www.aiche.org/cep.