Responsible AI

The environmental impact of AI and how to mitigate it

  • Blog
  • 8 minute read
  • October 10, 2025

Welcome to our series on the interaction between AI and environmental, social and governance (ESG) principles. Our goal is to provide you with a deep dive into different perspectives on the impact of AI and a responsible way forward. The first three articles delve into different aspects of ESG, starting here with the environmental impact; the final article presents a C-suite playbook for responsible AI. Stay tuned for the following articles!

1 The insatiable nature of AI

The conversation around artificial intelligence often focuses on algorithms and capabilities. Yet, its most immediate global impact may be measured in megawatts, cubic metres and tonnes. The exponential growth of AI is fuelling an unprecedented expansion of data centres, creating a surge in demand for electricity that is straining power grids. This energy thirst is matched by an insatiable consumption of fresh water, pitting tech giants against local communities in an intensifying competition for resources. We examine the compounding impacts of AI’s energy use, water footprint and e-waste and hold it up to the light.

1a Energy

The most scrutinised environmental impact of AI is its immense consumption of electricity, which translates directly into carbon emissions depending on the energy mix of the local grid and data centre in question. This consumption occurs across the AI lifecycle but is most concentrated in two phases: the initial, intensive training of a model and its ongoing, widespread use for inference (see figure 1).

  • Computing (85.5%): model training, inference and network traffic

  • Infrastructure (11.5%): data centre construction and hardware embodied impacts

  • Usage (3%): embodied impacts of end-user equipment and power consumption

Figure 1: Breaking down the greenhouse gases produced by AI.


Training a large, foundational AI model is a one-time event that requires an extraordinary amount of computational power and energy. This process involves feeding the model trillions of data points over weeks or months, allowing it to learn the patterns and relationships that underpin its capabilities. While model training’s impact is significant, the ongoing operational use of AI models (a process known as inference) constitutes the bulk of their environmental impact over their lifecycle. Inference occurs every time a user queries a model, generating a small but cumulative energy cost that scales with adoption. This high and continuous energy demand from inference is a primary driver of the explosive growth in data centre power consumption (see figure 2).

  • Large text model (405B parameters): 6,702 joules

  • 1024x1024 pixel image: 4,402 joules

  • 5-second video: 3,400,000 joules

Figure 2: Average estimated energy consumption1.


While these figures give an idea of the differences between types of output, in practice there are very high variations caused by:

  • Size of the chosen model

  • Number of iterations

  • Task-specific traits (for example, context length) 
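To make the figure 2 estimates more tangible, here is a quick conversion into watt-hours (a minimal sketch: the per-output figures are the averages quoted above, and the ratio is derived directly from them):

```python
# Convert the average per-output energy estimates quoted in figure 2 into
# watt-hours (1 Wh = 3,600 J). These are averages only; actual values vary
# widely with model size, iteration count and task-specific traits.
ESTIMATES_J = {
    "text response (405B-parameter model)": 6_702,
    "1024x1024 pixel image": 4_402,
    "5-second video": 3_400_000,
}

for task, joules in ESTIMATES_J.items():
    print(f"{task}: {joules / 3_600:.2f} Wh")

# The video estimate dwarfs the text estimate by roughly 500x:
ratio = ESTIMATES_J["5-second video"] / ESTIMATES_J["text response (405B-parameter model)"]
print(f"video vs text: {ratio:.0f}x")
```

At roughly 944 Wh, a single five-second video clip sits in the same energy range as running a typical laptop for a full working day, which is why the mix of output types matters as much as the raw query count.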

The situation in Europe

The European data centre market is undergoing a period of unprecedented expansion, transforming from a critical piece of IT infrastructure into a foundational pillar of the continent's digital economy. Simultaneously, the continent is focussing on data centre sustainability, according to the European Data Centre Association’s (EUDCA) inaugural ‘State of European Data Centres’ report2.

  • Projected €100 billion in investment for European Data Centres by 2030.

  • Annual demand growth of 15% per year between 2023 and 2030.

  • In Belgium specifically, the estimated compound annual growth rate for 2023-2030 is 23%, increasing from 136 MW in 2025 to 327 MW in 2030.

  • The share of Hyperscaler-owned IT power supply is expected to decrease by 12 percentage points, from 54% in 2025 to 42% in 2030.

  • 94% of energy used in the surveyed EU data centres comes from renewable sources3.

The intense growth of the data centre market is not speculative; it is a direct response to a tidal wave of demand from several overlapping technological revolutions – the simultaneous, not sequential, waves of cloud computing, 5G networking and artificial intelligence – resulting in compounded growth. 

 


Today, available data to correctly assess the environmental impact of AI remains limited. The European Union's updated Energy Efficiency Directive (EED) is poised to transform this landscape. Starting from January 1, 2025, the directive makes the measurement and reporting of power usage effectiveness (PUE) mandatory for all data centres across the EU. Furthermore, this data will not be kept private; it must be submitted to a publicly accessible European database.

  • The PUE for European Data Centres (surveyed in 2024) is 1.48 with targets set at 1.30 or lower for new facilities in cool climates and below 1.40 in warm climates. 

  • This means for every Watt powering IT equipment, an additional 0.48 Watt is required to power auxiliary cooling and other facility systems.

  • This figure is slightly lower than the US, which averaged around 1.56 in 2024.

  • Hyperscalers in both regions have achieved PUEs as low as 1.10. For example, in its 2025 Environmental Report, Google reports a PUE of 1.08 for its data centre in Saint-Ghislain, Belgium4.
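The PUE arithmetic behind these bullets can be sketched in a few lines (the 10 MW IT load is a hypothetical example; the PUE values are the ones quoted above):

```python
def facility_power(it_power_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load at a given PUE.

    PUE = total facility power / IT equipment power, so total = IT x PUE.
    """
    return it_power_kw * pue

# Overhead per IT watt at the 2024 European average PUE of 1.48:
overhead_w = 1.48 - 1.0  # 0.48 W of cooling/auxiliary power per IT watt

# Hypothetical 10 MW IT load at the EU average vs a hyperscaler PUE of 1.08:
eu_avg = facility_power(10_000, 1.48)       # ~14,800 kW total draw
hyperscaler = facility_power(10_000, 1.08)  # ~10,800 kW total draw
print(f"overhead: {overhead_w:.2f} W/W, saving: {eu_avg - hyperscaler:,.0f} kW")
```

For this illustrative load, the gap between the European average and a best-in-class hyperscaler facility amounts to roughly 4 MW of continuous draw, which is why PUE is the headline efficiency metric the EED now makes mandatory to report.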

How AI can enable smarter grid management

Europe's commitment to climate neutrality by 2050 demands a fundamental transformation of its energy infrastructure. Modernising the grid to intelligently integrate vast renewable energy sources and robustly manage decentralised production is paramount. Here, AI and Generative AI (GenAI) emerge as powerful catalysts, revolutionising Europe's energy systems. Through advanced optimisation of renewable integration and enhanced grid management capabilities, AI is not only accelerating the EU’s climate goals but also fortifying its global competitiveness5.

1b Water

The scale of water consumption for both training and inference is startling, yet accurate and recent numbers from AI model providers remain scarce. One exception is Mistral, which reported the environmental footprint of its Large 2 model: as of January 2025, after 18 months of usage, Large 2 had generated the following water consumption impacts6:

  • 281,000 m3 water consumed

  • 45 mL of water for a 400-token response during model inference

A good proxy is to look at water consumption figures reported by the Hyperscalers. Looking at the combined water consumption of Microsoft and Google, as reported in their 2025 sustainability reports7, we see an increase in total water withdrawn between 2020 and 2024 of 76.6%. On average, 68% of the withdrawn water is consumed. In 2024 this meant 36.6 million m3 of water was consumed by Microsoft and Google together, up from 18.4 million m3 in 2020 (+99%).

  • Total withdrawn: 1.93 million m3

  • Potable withdrawn: 0.05 million m3

  • Consumed: 1.49 million m3

Figure 3: Water consumption for Google’s Data Centre in Saint-Ghislain, Belgium8.


In some areas, this puts data centre water consumption in direct competition with municipal and agricultural water needs, often in regions already experiencing water stress. Google reports 14% of their total water consumption is from sources at high risk of depletion or scarcity, with another 14% at medium risk. Microsoft does not disclose this information as part of its sustainability report.

A critical but often overlooked complexity is that optimising for carbon efficiency does not guarantee water efficiency. In fact, the two can be in direct conflict. For example, a data centre might shift its computational workloads to a time of day when the electricity grid is powered by low-carbon sources. However, if this time coincides with the hottest part of the day, the water required for cooling could increase dramatically. This misalignment means that a company's efforts to reduce its carbon footprint could inadvertently worsen its water footprint, creating a difficult trade-off for sustainability managers9.
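A toy scheduling example makes this conflict concrete (all numbers below are hypothetical; real carbon-intensity and cooling-water profiles are site- and grid-specific):

```python
# Toy model of the carbon/water trade-off: choosing when to run a workload.
# Midday solar lowers grid carbon intensity, but midday heat raises the
# water needed for cooling. All figures below are illustrative, not measured.
profiles = {
    #  hour      gCO2 per kWh    litres of cooling water per kWh
    "03:00": {"carbon": 420, "water": 1.2},  # night: fossil-heavy grid, cool air
    "13:00": {"carbon": 180, "water": 3.5},  # midday: solar-rich grid, hot air
}

lowest_carbon_hour = min(profiles, key=lambda h: profiles[h]["carbon"])
lowest_water_hour = min(profiles, key=lambda h: profiles[h]["water"])

# The two objectives pick different hours, so optimising one worsens the other.
print(lowest_carbon_hour, lowest_water_hour)  # → 13:00 03:00
```

In this stylised example, the carbon-optimal hour nearly triples the water drawn per kilowatt-hour, which is exactly the misalignment a sustainability manager has to weigh.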

Improving Europe’s precision agriculture with AI

AI is poised to revolutionise agriculture, particularly for the optimisation of water usage, a critical resource in many regions. AI, supported by remote sensing, Earth Observation systems and innovative on-field sensors, enables precision agriculture by delivering highly granular insights into crop needs. This allows for predictive analytics that can precisely adjust watering schedules based on forecasted rainfall and assess environmental risks, significantly reducing unnecessary water consumption. Projects like the EU-funded AgriBIT demonstrate this potential, showing how combining AI with precise satellite and sensor data can lead to substantial improvements in resource efficiency, allowing farmers to monitor and predict crop conditions to optimise water resources and foster more sustainable farming practices10.

1c Hardware and e-waste

The rapid pace of innovation in AI is creating a powerful engine for electronic waste (e-waste) generation. The relentless pursuit of more powerful models drives a short upgrade cycle for hardware, rendering expensive servers and accelerators obsolete in just a few years.

  • Global e-waste volume 
    The world generated a record 62 million tonnes of e-waste in 2022, a figure that is growing rapidly. Critically, only about 20% of this waste is formally collected and properly recycled, with the rest often ending up in landfills or being illegally shipped to developing countries where informal and hazardous recycling methods prevail11.

  • Material consumption 
    According to Mistral’s study on its Large 2 model lifecycle12, the manufacturing, transportation and end-of-life of the server infrastructure used accounts for 61% of all materials consumed in the model’s lifecycle.

  • AI as an accelerator 
    Researchers at the Chinese Academy of Sciences project that the expansion of AI will increase the amount of global e-waste by 3% by 2030. This translates to an additional 2.5 million tonnes of e-waste annually, an amount one researcher has equated to discarding 13 billion iPhones each year. This is driven by the prediction that tech companies will replace the components in their AI systems every three years to keep up with performance demands. This rapid obsolescence, combined with a lack of transparency (only 25% of data centre operators measure their e-waste), means that tonnes of valuable and often hazardous materials are being discarded without proper oversight13.

A critical paradox emerges from the data: while individual AI tasks are becoming vastly more energy-efficient through innovations in software and hardware, the aggregate energy consumption of the AI sector is exploding. This phenomenon suggests that efficiency gains, by making AI more accessible and affordable, are paradoxically fuelling a surge in overall demand that outpaces the savings from optimisation. This is a classic example of the Jevons Paradox, where the technological progress that increases the efficiency with which a resource is used, tends to increase (rather than decrease) the rate of consumption of that resource. 
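The Jevons dynamic can be illustrated with deliberately simple numbers (the per-task figure reuses the figure 2 text estimate; the efficiency gain and usage growth rates are hypothetical):

```python
# Jevons Paradox, illustrated: energy per task halves, but cheaper, more
# accessible AI drives a 4x increase in usage, so aggregate consumption
# still doubles. The growth and efficiency rates here are made up.
energy_per_task_j = 6_702   # per-task energy before the gain (figure 2 text estimate)
tasks_before = 1_000_000

total_before = energy_per_task_j * tasks_before
total_after = (energy_per_task_j * 0.5) * (tasks_before * 4)

print(total_after / total_before)  # → 2.0: demand growth outpaces the savings
```

Whenever the growth factor in usage exceeds the reciprocal of the efficiency factor (here, 4 > 1/0.5 = 2), total consumption rises despite every individual task getting cheaper.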

2 Mitigating AI’s environmental impact

While it might seem that the environmental impact of AI is high, and likely to stay that way, there are approaches we can take to improve the sustainability of AI. By rethinking the way we use the tools we already have, for example by dropping the insistence that larger, centralised models are the only way forward, we can start to lower the environmental impact of each AI inference. However, this is only part of the story: we still need to tackle the accelerating growth in the number of AI inferences, something that will require a commitment to responsible innovation from all of us.

2a Optimising AI performance

As AI continues to redefine how we do business, optimising its operational footprint is critical. Take large language model (LLM) inferencing, for example: it offers a powerful capability, but its energy and cost demands can be significant. Innovative approaches are emerging to address this. Microsoft's DynamoLLM, for instance, is a pioneering energy-management framework that dynamically reconfigures inference clusters to optimise for energy efficiency. This can lead to impressive outcomes: reducing energy usage by 53%, cutting operational carbon emissions by 38% and lowering costs by 61%, all while meeting critical latency requirements14. That means you can deploy powerful LLM solutions more sustainably and cost-effectively, driving efficiency across your operations.

Another strategic shift is rethinking the "bigger is better" mindset for AI models. While LLMs offer broad generalist capabilities, their sheer scale isn't always the most efficient path. For many specialised applications, particularly with the rise of agentic AI systems, small language models (SLMs) are proving to be a game-changer. These focused models are sufficiently powerful, inherently more suitable and significantly more economical for performing specific, repetitive tasks. This approach allows you to tailor AI deployments to precise business domains, from finance to human resources, or to specific industries like financial services or healthcare, so you can achieve targeted performance with optimal resource utilisation. Embracing SLMs means accelerating your journey toward a more agile and cost-efficient AI future15.

2b Powering progress

While techniques like quantisation, pruning and the use of smaller, specialised models can dramatically reduce the energy required for individual AI tasks (by up to 90%), this isn't the full picture. The International Energy Agency (IEA) projects that aggregate data centre energy demand will more than double by 2026. This surge is largely driven by newly accessible AI capabilities: the lower cost and ease of adopting open-source models make AI more widespread, leading to a net increase in compute cycles and resource use.

This dynamic means technical solutions for efficiency, while crucial, aren't enough on their own. A truly sustainable AI strategy can't just focus on making AI "greener" per task; it must also include governance and policy measures to manage the overall scale of AI deployment and consumption. Without addressing this growth in demand, efficiency gains risk being entirely negated by the sheer volume of new applications, making it vitally important to define a holistic strategy. 

Beyond core efficiency, the industry is making a powerful shift toward renewable energy. To meet sustainability goals and mitigate exposure to volatile fossil fuel markets, operators are aggressively securing clean power, primarily through long-term power purchase agreements (PPAs). This strategic move ensures a more resilient and environmentally sound energy future.

Data centre operators are also exploring a range of innovative solutions to manage their energy footprint:

  • Waste heat reuse 
    A data centre is, fundamentally, a large heat producer. The concept of district heating, where this waste heat is captured and repurposed for nearby residential buildings, offices or even swimming pools, is gaining substantial traction. This bold approach transforms an environmental challenge into a community asset, becoming increasingly common across the Nordics and parts of Western Europe.

  • Alternative fuels 
    Diesel generators for backup power are a notable source of on-site emissions. Operators like Digital Realty are pioneering the use of hydrotreated vegetable oil (HVO 100). This renewable diesel, derived from waste fats and oils, serves as a drop-in replacement for conventional diesel, now deployed across 30 of the company's global sites. This demonstrates a clear commitment to sustainable alternatives.

  • Advanced cooling techniques 
    The adoption of "free cooling" systems, which leverage cool ambient air, is a widespread practice. Operators like Colt Data Centre Services report utilising these intelligent techniques for the majority of the year at their Paris data centre, maximising natural advantages to significantly reduce energy consumption.

2c Decentralising intelligence

One significant challenge in managing AI’s energy footprint is the constant data transmission between devices and centralised cloud data centres. Edge AI powerfully addresses this by shifting computation to the edge of the network. This means processing data locally, directly on or near the device where it’s generated, radically cutting down the need for distant data centres.

By processing data locally, Edge AI can dramatically reduce the network traffic flowing between user devices and centralised inference endpoints. This shift not only conserves the energy tied to data transmission but also enhances speed, strengthens security and protects privacy. We're talking about tangible benefits that propel your operations forward.

The powerful synergy of SLMs and increasingly robust consumer hardware is making this scenario highly feasible for a wide range of GenAI use cases. This means you can deploy sophisticated AI solutions closer to the action, unlocking new possibilities for efficiency and innovation.

3 Looking forward

As we've discussed, the exponential growth of AI brings significant environmental considerations, from escalating energy and water consumption in data centres to the challenge of e-waste from rapid hardware refresh cycles. Yet, this complexity is met with groundbreaking innovations. We've seen how advancements in model optimisation, such as the strategic use of SLMs and sophisticated inferencing techniques, are driving remarkable efficiency gains. Concurrently, data centre operators are transitioning to renewable energy sources, implementing waste heat reuse and pioneering advanced cooling to reduce their footprint. The emergence of Edge AI further shifts the paradigm, decentralising computation to slash data transmissions and enhance local processing capabilities. This journey towards sustainable AI is a collective one, where every practitioner plays a role. By consciously choosing the right-sized model for specific tasks, for instance, individuals can directly contribute to more resource-efficient AI deployments, embodying a commitment to responsible innovation.


This deep dive into AI's environmental impact is just the beginning of our comprehensive exploration into its role in environmental, social and governance principles. While the technological solutions for a greener AI are rapidly evolving, a truly sustainable future demands a broader perspective. In our upcoming articles, we will shift our focus to how AI is shaping customer and employee engagement, the critical aspects of trustworthy AI and the imperative of effective AI governance in light of evolving regulations like the EU AI Act. In the final article, we will look at how to empower leaders to strategically navigate this transformative era and build an AI future that is both powerful and profoundly sustainable.

Contact us

Xavier Verhaeghe

Partner Technology Consulting & Innovation, PwC Belgium

Michiel De Keyzer

Director, PwC Belgium

Connect with PwC Belgium