Keeping the water flowing - how the water industry is finding new ways to manage the AI revolution

Author: Anne-Marie Kirkman

At a glance

Advances in technology and the artificial intelligence boom are driving a dramatic increase in the number of data centres being built globally. But while new centres are announced weekly, investors and operators often underestimate the sheer amount of energy and water required to operate them.

New processors for AI systems are generating far more heat than previous generations, pushing traditional water-based cooling systems beyond what they can handle. Methods that once sufficed are now plagued with inefficiencies, raising the risk of hardware failures and putting unnecessary pressure on precious drinking water reserves.

A single hyperscale data centre can consume 2 million litres of water a day and, over a year, enough electricity to power 100,000 homes. The price tag can soar to US$20 billion for just one data centre, and that’s just the beginning. Operators must also navigate a complex web of concerns – from water and energy management to finding optimal locations, considering land use and engaging with community interests.

Designing data centres with less water

The International Energy Agency estimates that data centres globally use more than 560 billion litres of water a year, a figure that could skyrocket to 1.2 trillion by 2030. This extraordinary thirst stems from both direct cooling needs and the indirect demands of chip manufacturing and power generation. 
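A short back-of-envelope calculation puts these figures in perspective. This is a sketch only: the numbers come from the article above, and the six-year horizon to 2030 is an assumption, not an IEA projection method.

```python
# Back-of-envelope check of the article's figures. The water volumes are
# from the text above; the growth-rate maths is illustrative, not a forecast.

current_use_l = 560e9        # litres per year today (IEA estimate)
projected_use_l = 1.2e12     # litres per year by 2030
years = 6                    # assumed horizon, e.g. 2024 -> 2030

# Compound annual growth rate implied by the two figures
cagr = (projected_use_l / current_use_l) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 13-14% per year

# Per-facility context: a hyperscale centre at 2 million litres a day
per_site_annual_l = 2e6 * 365
print(f"One hyperscale site: ~{per_site_annual_l / 1e6:.0f} million litres/year")
```

Roughly doubling in six years implies double-digit annual growth, which is why cooling choices made at the design stage matter so much.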

Historically, data centres have relied on air-based cooling systems that discharge heat by evaporating water in cooling towers – systems that are notoriously water-intensive. 

As data centres scale to support AI, the question is no longer whether water use can be optimised - but whether water availability is considered early enough to shape cooling choices, site selection and long-term viability.

Fortunately, proven innovations are already changing what’s possible. Advances in chip manufacturing, enabling new technologies such as liquid immersion cooling, provide practical and tangible options for reducing water demand, while new planning strategies help balance the benefits of data centres with the realities faced by local communities when these resource-hungry facilities arrive.

 

Five design approaches reshaping the industry

These cooling approaches share a common advantage: they allow water demand to be reduced dramatically, or removed entirely - but only if they’re considered early in the design process.

  1. Targeted cooling: Focus where heat is created
    Instead of cooling entire buildings, leading companies are targeting heat where it’s generated. Specialised cooling systems attach directly to processors, achieving near-zero water use through closed-loop systems that only need minimal maintenance.
  2. Complete equipment immersion: Maximum heat removal
    Servers are completely submerged in specially designed fluids that don’t conduct electricity. While it sounds futuristic, this approach is already in place. It eliminates water evaporation entirely at the equipment level, with cooling fluids continuously recycled without needing any fresh water.
  3. Advanced refrigeration: High-density heat transfer
    By harnessing the phase-change properties of refrigerants, these systems transfer far more heat with far less fluid than water. Compact, sealed and water-free, they deliver high-performance cooling without the water waste of traditional towers.
  4. Air-only cooling: Eliminating water dependence
    By transferring heat directly to outside air through specialised heat exchangers instead of cooling towers, these systems achieve zero water consumption. While they use more electricity, they remove the dependency on water in cooling systems.
  5. Water recycling systems: Capture and reuse
    Facilities can now capture and reuse water through advanced recycling, rainwater collection and treatment systems that significantly reduce freshwater needs. However, while recycling and reuse are essential, they are most effective when paired with fundamentally lower-water cooling systems, rather than used to justify high baseline demand.

Designing for water from the outset

Prioritising water as a vital business resource and implementing comprehensive planning should be fundamental to any building or construction initiative. Data centre operators are increasingly developing strategies to deliver optimal outcomes from the outset:
 
  • Account for total water requirements: Quantify total water usage across all operational facets. These include direct facility consumption, electricity-related demands and the water utilised in manufacturing equipment. This comprehensive measurement reveals where resource investments will deliver the biggest returns.
  • Strategic site selection: Savvy location decisions incorporate an evaluation of water availability, favouring sites with reliable water resources and renewable energy sources that require minimal water to generate power. For example, when Meta built its first data centre outside of the US, it chose a site in Sweden known for its cooler climate and abundant hydroelectric capacity. However, remote sites can have drawbacks, such as increased latency and the need for proactive community engagement.
  • AI-driven operations: Leading organisations are leveraging artificial intelligence integrated with smart sensors to manage real-time temperature control, demand forecasting and system optimisation. These measures significantly reduce cooling requirements and preempt potential issues.
  • Phased implementation: Rather than constructing facilities to accommodate maximum projected demand from inception, successful projects phase in water requirements incrementally over time. Collaboration with infrastructure authorities helps align operational approvals with actual business expansion.
  • Working with your existing infrastructure: Replacing legacy infrastructure isn’t always necessary. Introducing new cooling technologies during scheduled upgrades allows for targeted improvements in high-demand areas, facilitating the adoption of advanced solutions without massive initial expenditure. 
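The first planning step above - accounting for total water requirements across direct and indirect uses - can be sketched as a simple aggregation. All category names and volumes below are hypothetical placeholders, not measurements from any real facility.

```python
# Minimal sketch of a total-water-requirements tally, summing direct
# facility use with indirect demands. All figures are hypothetical
# placeholders chosen for illustration only.

water_footprint_ml_per_year = {
    "direct_cooling": 300.0,          # megalitres/yr evaporated or discharged on site
    "electricity_supply": 450.0,      # water consumed generating purchased power
    "equipment_manufacturing": 80.0,  # embodied water in servers and chips, amortised
}

total = sum(water_footprint_ml_per_year.values())

# Rank sources by volume so the biggest levers for reduction stand out
for source, volume in sorted(water_footprint_ml_per_year.items(),
                             key=lambda kv: kv[1], reverse=True):
    print(f"{source:>25}: {volume:7.1f} ML/yr ({volume / total:.0%})")
print(f"{'total':>25}: {total:7.1f} ML/yr")
```

Even a toy tally like this illustrates the point of the first bullet: once indirect demands are counted, the largest share of a facility’s water footprint may sit outside the cooling plant itself, which changes where investment delivers the biggest returns.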

The bottom line

Water availability is rapidly becoming a defining constraint for data centre development, rather than a secondary operational consideration. Low-water and water-free cooling technologies now provide viable alternatives to traditional cooling towers, but their advantages are only realised when water requirements are embedded early in site selection, design and approval processes.

By prioritising water availability from the outset and aligning cooling strategies accordingly, the industry can move beyond incremental efficiency gains toward more resilient, sustainable and community-compatible data centre development. In this context, water utilities also take on a broader role - not only as regulatory authorities, but as strategic planning partners supporting the next wave of AI infrastructure.

 
