
Environmental Impact Hub

What does it cost to ask AI a question? Not in subscription fees, but in electricity, water, and rare earth minerals. As generative AI has moved from research labs into everyday use, these hidden costs have multiplied, but they remain largely invisible to the millions of people now using these tools daily.

AI technology has existed for decades, but generative AI (systems that create text, images, video, and code on demand) represents a fundamental shift in scale and accessibility. Since ChatGPT's launch in late 2022, these tools have exploded into mainstream use, with millions of people making requests that require enormous computational power. Unlike earlier AI applications that ran specific, limited tasks, generative AI models are trained on massive datasets and remain "always on," ready to generate content at any moment. The shift from specialized AI tools to general-purpose, widely accessible systems has accelerated both adoption and environmental impact.

Yet conversations about AI often focus on capabilities and ethical issues like bias and privacy, while the environmental dimension receives far less attention. Part of this silence stems from the difficulty of measuring impact: companies rarely disclose energy consumption, researchers lack standardized metrics, and the global infrastructure supporting AI is often invisible to end users.

This resource provides a starting point for understanding AI's environmental footprint across its full lifecycle: from the mining of rare earth minerals to the disposal of outdated hardware. Our goal is to make this information accessible, allowing you to engage with these questions and make informed decisions about your own AI use.


How to use this resource:

This resource follows AI's lifecycle journey from raw material extraction through manufacturing, training, and daily use, ending with environmental justice. Environmental justice recognizes that marginalized communities (particularly communities of color, those in low-income areas, and Indigenous populations) disproportionately bear the burden of environmental hazards while having the least power in environmental decision-making. 

While the following sections build on each other to tell a complete story, you don't need to read everything or follow a linear path. Choose the sections most relevant to your interests or questions; the resource is designed for self-directed exploration. Each section includes two types of sources: sources under "Jumping Off" provide overviews without heavy technical detail, while sources under "Deeper Dive" are more technical and comprehensive, with methods and data. This is a living document, and we will continue updating it as new research emerges.

Getting Started

New to AI’s environmental impact? Start here.

The Measurement Challenge

How we track and quantify AI’s footprint.

From Chip to Landfill

Mining, manufacturing, and disposal of AI’s hardware.

Physical Foundations

Data centers, power systems, and infrastructure demands.

Training AI Models

The enormous energy costs of teaching AI systems.

AI in Daily Use

The energy costs of everyday queries and inference at scale.

Uneven Geographic Burdens

How AI’s costs fall disproportionately on certain regions.

Getting Started

Before we unpack these impacts in detail, here are a few starting points for understanding the scale of AI’s physical footprint:


The Measurement Challenge

Unlike a car's fuel efficiency rating or a refrigerator's energy label, AI systems don't come with standardized environmental reporting. Companies treat energy consumption and emissions data as proprietary information. Researchers work with incomplete data and estimates. The infrastructure spans global supply chains, making it nearly impossible to track the full picture from rare earth mining to e-waste.

This isn't just a technical problem; it's also political and ethical. What we choose to measure shapes what we can address. Should we focus on energy during training, or include manufacturing emissions? Per-query efficiency, or total system impact? Without standardized metrics, we can't compare models, hold companies accountable, or decide which AI applications justify their environmental cost.
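As a rough illustration of why these boundary choices matter, the sketch below uses entirely hypothetical figures (the emission values and query volume are assumptions, not measurements of any real system) to show how the reported footprint changes depending on what gets counted.

```python
# Illustrative only: every figure below is a hypothetical assumption,
# not a measurement of any real model or data center.
embodied_kgco2 = 150_000            # hardware manufacturing and transport (assumed)
training_kgco2 = 300_000            # one training run (assumed)
per_query_kgco2 = 0.002             # one inference request (assumed)
queries_per_year = 10_000_000_000   # deployment scale (assumed)

inference_kgco2 = per_query_kgco2 * queries_per_year

print(f"Training only:              {training_kgco2:>12,.0f} kg CO2e")
print(f"Training + manufacturing:   {training_kgco2 + embodied_kgco2:>12,.0f} kg CO2e")
print(f"Full lifecycle, first year: {training_kgco2 + embodied_kgco2 + inference_kgco2:>12,.0f} kg CO2e")
```

Under these made-up numbers, the "footprint" differs by a factor of nearly seventy depending on whether the boundary stops at training or includes a year of deployment.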

This section introduces key frameworks and terminology for understanding AI's environmental footprint. It explains why measurement remains so difficult and explores emerging efforts to create standards. Understanding these foundations is essential for interpreting environmental claims and recognizing what stays unmeasured. As measurement practices evolve, so will this resource.

Sources

From Chip to Landfill

Every AI model runs on physical hardware – processors, memory, storage systems – and that hardware carries substantial environmental costs long before it enters a data center. The rare earth minerals essential for modern chips come from mining operations that devastate ecosystems and exploit labor, often in regions far from where AI systems are ultimately deployed. Manufacturing semiconductors requires extraordinary amounts of ultrapure water, toxic chemicals, and energy; production facilities consume resources at scales that strain regional infrastructure.

This material footprint is often invisible in conversations about AI's environmental impact, which tend to focus on operational energy use. But hardware manufacturing accounts for a significant share of lifecycle emissions. Making matters worse, AI infrastructure has remarkably short replacement cycles: chips are frequently swapped out for newer, more efficient architectures after just one to three years. This creates mounting streams of electronic waste.

Unlike operational emissions, which might be reduced through renewable energy, the material impacts of hardware production are harder to mitigate. The environmental and human costs are often externalized to communities near extraction sites, manufacturing facilities, and informal e-waste processing operations. This makes AI's material footprint a matter of global environmental justice.

Sources

Physical Foundations

When we use AI tools, the interaction feels weightless (a text prompt followed by an instant response). But behind that seamless interface sits massive physical infrastructure: data centers filled with servers that require enormous amounts of electricity to run and even more resources to keep cool. As AI models grow larger and more widely deployed, data center construction is accelerating globally to meet demand.

Data centers consume staggering amounts of both energy and water. Cooling systems prevent servers from overheating, often using water-intensive evaporative cooling or drawing heavily on local electric grids. In regions already facing water scarcity or grid stress, this demand can strain local resources and communities.

Companies frequently tout their use of renewable energy or carbon offsets, but these claims deserve scrutiny. Purchasing renewable energy credits doesn’t necessarily mean a data center runs on clean power in real time; it is an accounting mechanism, not a direct connection to renewable sources. These purchases don't eliminate the strain corporate demand places on regional grids and water supplies.

Meanwhile, efficiency improvements at individual facilities are often overwhelmed by explosive growth in AI deployment. A data center might become 20% more efficient, but if AI usage doubles or triples, total resource consumption still climbs. The implication is that sustainable AI will require not just better technology, but fundamental limits on growth and transparent accountability for local impacts.
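A minimal sketch of that arithmetic, using made-up numbers purely for illustration:

```python
# Hypothetical numbers: a 20% per-unit efficiency gain outpaced by doubling demand.
energy_per_unit = 1.0          # energy per unit of AI workload (arbitrary units)
workload = 100                 # units of workload before growth

new_energy_per_unit = energy_per_unit * 0.8   # facility becomes 20% more efficient
new_workload = workload * 2                   # but AI usage doubles

before = energy_per_unit * workload           # 100
after = new_energy_per_unit * new_workload    # 160

print(f"Total consumption changes by {(after / before - 1):+.0%}")  # +60%
```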

Sources

Training AI Models

Training a large AI model is enormously resource-intensive. Thousands of specialized processors run continuously for weeks or months, consuming massive amounts of electricity and generating substantial carbon emissions. As models have grown from millions to billions to trillions of parameters, the computational requirements and environmental costs of training have increased exponentially. By one widely cited estimate, a single training run for a state-of-the-art model can emit as much carbon as five average cars produce over their entire lifetimes.
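As a back-of-envelope sketch of how such estimates are typically built, the figures below (number of accelerators, power draw, run length, data-center overhead, and grid carbon intensity) are all assumptions chosen for illustration, not the specifications of any actual training run:

```python
# Rough training-run estimate: energy = accelerators x power x time x overhead,
# emissions = energy x grid carbon intensity. All inputs are assumed values.
num_accelerators = 10_000     # chips running in parallel (assumed)
power_kw_each = 0.7           # average draw per chip, kW (assumed)
training_days = 60            # wall-clock duration (assumed)
pue = 1.2                     # data-center overhead for cooling, etc. (assumed)
grid_kgco2_per_kwh = 0.4      # carbon intensity of the local grid (assumed)

energy_kwh = num_accelerators * power_kw_each * training_days * 24 * pue
emissions_tonnes = energy_kwh * grid_kgco2_per_kwh / 1000

print(f"Energy:    {energy_kwh:,.0f} kWh")           # ~12.1 million kWh
print(f"Emissions: {emissions_tonnes:,.0f} t CO2e")  # ~4,800 tonnes
```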

Training is often framed as a one-time cost: train a model once, then deploy it for countless users. But this understates the reality. Models are frequently retrained to improve performance, updated with new data, and fine-tuned for specific applications. Each iteration carries its own environmental cost.

The race toward ever-larger models reveals a tension. The efficiency improvements that make training feasible also enable the construction of even more resource-hungry systems. Without mandatory disclosure and environmental accounting, this cycle will likely continue until external constraints (regulatory, resource-based, climate-related) force a reckoning.

Sources

AI in Daily Use

While training a model happens once or periodically, inference (the process of actually using a trained model to generate responses, make predictions, or produce images) happens billions of times per day across millions of users. Each query requires computational resources and energy. Individual interactions may seem negligible, but at the scale of widespread deployment, inference can rival or exceed training as a source of environmental impact.

This creates what economists call the "rebound effect," or the Jevons paradox: efficiency improvements that reduce per-query energy costs can paradoxically increase total environmental impact. A model that uses half the energy per query but gets used ten times more often still consumes five times as much energy overall. As AI systems become more efficient and user-friendly, they integrate more deeply into everyday products and workflows, multiplying the total number of queries.
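The arithmetic behind the paradox, again with hypothetical numbers:

```python
# Rebound effect (Jevons paradox) with assumed numbers: per-query energy halves,
# usage grows tenfold, and total consumption still rises fivefold.
energy_per_query = 1.0        # arbitrary units before the efficiency gain
daily_queries = 1_000_000     # usage before the gain (assumed)

new_energy_per_query = energy_per_query * 0.5   # per-query cost cut in half
new_daily_queries = daily_queries * 10          # but usage grows tenfold

before = energy_per_query * daily_queries           # 1,000,000
after = new_energy_per_query * new_daily_queries    # 5,000,000

print(f"Total energy grows {after / before:.0f}x despite the per-query savings")
```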

This paradox means technological efficiency alone cannot solve the problem. We also need to consider questions of appropriate use and scale: Which applications justify AI's resource costs? Where does AI actually add value, and where is it simply convenient? Without grappling with these questions, each efficiency gain may simply accelerate AI's expansion into more domains of daily life, amplifying its overall environmental footprint. 

Sources

Uneven Geographic Burdens

AI's environmental footprint is not distributed equally across the globe. The rare earth minerals that make AI hardware possible are extracted primarily in regions of the Global South, where mining operations contaminate water supplies, displace communities, and generate toxic waste with minimal accountability. Semiconductor manufacturing concentrates in areas with fewer labor and environmental protections. When hardware reaches the end of its short lifespan, much of it is shipped as e-waste to countries with limited capacity to process it safely, exposing workers and communities to toxic materials. Meanwhile, the benefits of AI development (the profits, the technological capabilities, the decision-making power) concentrate in wealthy nations far removed from these harms.

Within the United States and other wealthy nations, data centers are often sited in communities with less political power to resist them. These facilities draw heavily on local water supplies and electrical grids, sometimes exacerbating existing resource scarcity. Virginia, for example, hosts the world's largest concentration of data centers, creating unprecedented strain on the state's energy infrastructure and raising questions about whose needs are prioritized when resources become scarce.

The communities most burdened by AI's material demands are frequently the same communities least responsible for driving AI adoption and most vulnerable to climate impacts that AI's energy consumption helps accelerate. This reflects deeper patterns of environmental injustice: decisions about where to externalize costs and whose environments become "sacrifice zones" for technological progress. These aren't accidental outcomes; they follow historical patterns of how industrial development has distributed benefits and burdens along lines of race, class, and geography.

Sources
Last Updated
November 2025