AI Data Centres and Sustainability: Tackling Energy and Water Challenges
As Artificial Intelligence (AI) continues to reshape industries worldwide, the infrastructure required to support this rapid growth presents significant environmental challenges. At the heart of the issue are AI data centres: massive facilities that house the computing power needed to train and deploy AI models at scale.
These centres are critical to modern AI advancements, but they come with substantial energy and water costs. For companies like ours, understanding and addressing these impacts is essential to balancing innovation with sustainability.
The Energy Bottleneck: How Much Power Does AI Use?
AI workloads, especially training larger models, are growing at an unprecedented rate. The true bottleneck, however, is not the processors powering AI but the energy required to run them. AI data centres are expected to consume 90 terawatt-hours (TWh) of electricity annually by 2026, equivalent to around 10 gigawatts (GW) of critical IT power capacity. This places immense pressure on energy grids, with many regions already struggling to meet demand.
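As a quick sanity check, that annual energy figure can be converted into an average power draw. This is a rough sketch that assumes a constant load spread evenly across the year:

```python
# Convert a projected annual energy figure (TWh) into an average power draw (GW),
# assuming the load is spread evenly across the year.
HOURS_PER_YEAR = 365 * 24  # 8760

annual_energy_twh = 90  # projected AI data centre consumption by 2026 (article figure)
avg_power_gw = annual_energy_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours

print(f"Average draw: {avg_power_gw:.1f} GW")  # roughly 10 GW
```

Real demand is not flat, of course, so peak capacity requirements sit above this average.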
The rapid growth of AI has driven a surge in data centre construction, and these facilities now demand far more than raw computing power. AI servers, like Nvidia’s DGX H100, consume around 10 kilowatts (kW) of power each. Large-scale training clusters, which may contain thousands of GPUs, can demand over 28 megawatts (MW) of power, enough to power a small city.
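To see how those per-server numbers scale into a whole cluster, here is an illustrative calculation. The server count below is a hypothetical value chosen to show how the quoted figures fit together:

```python
# Illustrative cluster power estimate using the article's per-server figure.
KW_PER_SERVER = 10   # approximate draw of one Nvidia DGX H100 (article figure)
GPUS_PER_SERVER = 8  # a DGX H100 houses eight H100 GPUs

num_servers = 2800   # hypothetical cluster size (an assumption for illustration)
total_gpus = num_servers * GPUS_PER_SERVER
total_mw = num_servers * KW_PER_SERVER / 1000  # kW -> MW

print(f"{total_gpus:,} GPUs drawing about {total_mw:.0f} MW")
```

Note that this counts only the servers themselves; cooling and other facility overheads push the real figure higher.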
Cooling The Heat: How Much Water Does AI Use?
AI data centres consume significant amounts of water for cooling, particularly in hot climates where vast volumes are needed to maintain stable temperatures. Unlike electricity usage, this consumption is often overlooked.
A 2021 study revealed that Google’s U.S.-based AI data centres alone consumed 12.7 billion litres of water for cooling, 90% of it drinkable. The water footprint of AI is staggering: even routine interactions, such as asking a chatbot 40 questions, can consume up to one litre of potable water. The more intense the workload and the higher the ambient temperature, the more water is required to cool the equipment.
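Those per-interaction figures add up quickly at scale. A minimal sketch, taking the article's up-to-one-litre-per-40-questions estimate at face value and assuming a purely hypothetical query volume:

```python
# Rough water-per-query estimate from the article's figure of up to
# one litre of water per 40 chatbot questions (an upper-bound estimate).
litres_per_question = 1.0 / 40  # 25 ml per question

daily_queries = 1_000_000  # hypothetical daily query volume (an assumption)
daily_litres = daily_queries * litres_per_question

print(f"{litres_per_question * 1000:.0f} ml per question; "
      f"up to {daily_litres:,.0f} litres per day at {daily_queries:,} queries")
```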
This presents a growing challenge as climate change increases the frequency and severity of droughts. In 2022, Thames Water (UK) raised concerns over data centres in London using potable water for cooling during a drought. As global temperatures rise, the water demands of AI could exacerbate water scarcity, particularly in regions already experiencing shortages.
Geopolitics, Power and Sustainability
The location of AI data centres plays a crucial role in determining both their energy and water footprints. In regions like the United States, where electricity is affordable and natural gas is abundant, AI data centre growth is accelerating. Meanwhile, countries like Japan, South Korea and China face higher energy costs due to their reliance on imported coal and gas, making it harder for them to meet the power demands of AI.
Data centres in cooler climates or regions with less water stress can reduce their cooling needs. But in areas where water scarcity is a pressing issue, the strain on local water resources can create ethical concerns about prioritising data centre operations over the needs of communities.
Moreover, AI data centres in these regions are not just challenged by energy costs. China, for example, generates 60% of its electricity from coal, making the environmental impact of AI data centres even more severe. In contrast, the United States and parts of the Middle East, which have access to both low-cost electricity and renewable energy sources, are better positioned to meet the growing power demands sustainably.
The Path Forward: How Can We Reduce AI’s Environmental Impact?
To address the environmental impact of AI, the industry has to prioritise sustainability in both energy and water usage. Innovations such as liquid cooling systems can reduce the need for water, while renewable energy sources can lower the carbon footprint of data centres. For example, new liquid cooling technologies can cut per-rack power usage by up to 10% by eliminating the need for traditional air cooling, reducing overall energy consumption.
Additionally, where and when AI workloads are processed matters.
Data centres located in regions with cooler climates and access to abundant renewable energy will naturally have lower environmental footprints. Companies should also consider scheduling AI training during off-peak hours or periods of lower water stress to reduce their overall resource impact.
At fluo, we’re committed to staying ahead of these challenges. By leveraging cutting-edge technologies and partnering with industry leaders, we strive to minimise the environmental impact of our AI solutions. We recognise that the future of AI innovation depends not only on pushing the boundaries of technology, but also on ensuring that this growth is sustainable.
Balancing Innovation & Responsibility
AI is undeniably transforming the digital landscape, but with great power comes great responsibility. The energy and water demands of AI data centres pose significant environmental challenges that must be addressed.
By adopting smarter, more sustainable practices, along with staying at the forefront of technological advancements, we can lead the charge toward a future where AI and environmental responsibility go hand in hand. As the race to power AI continues, companies that prioritise sustainability will be the ones to shape the future – responsibly and effectively!
Sources:
https://blog.veoliawatertechnologies.co.uk/the-water-footprint-of-ai-data-centres
https://www.semianalysis.com/p/ai-datacenter-energy-dilemma-race