AI’s Growth Raises Concerns over Water Consumption in Data Centers


TEHRAN (Tasnim) – As artificial intelligence advances rapidly, its environmental impact is coming under increasing scrutiny, particularly the heavy demand its data centers place on water and energy.

Once a subject of science fiction, artificial intelligence (AI) has become a regular topic of conversation over the past two years. This shift has brought widespread attention to the environmental challenges associated with AI, especially the large amounts of water and energy required to train and run these systems.

A recent report highlighted that water consumption in Northern Virginia’s data centers—the largest cluster in the world—has risen by two-thirds in the past five years.

Researchers estimate that a conversation of 20 to 50 questions with an AI model like ChatGPT consumes the equivalent of a 500 ml bottle of water, depending on where and when the model is deployed.
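As a rough sense-check of what that figure implies per query, the arithmetic below simply restates the reported range; the 500 ml bottle and the 20 to 50 question span come from the report, not from any independent measurement.

```python
# Back-of-envelope: implied water use per query, restating the
# reported ~500 ml per 20-50 question conversation. These are the
# report's figures, not independent measurements.

BOTTLE_ML = 500  # reported water cost of one conversation

for questions in (20, 50):
    per_query_ml = BOTTLE_ML / questions
    print(f"{questions} questions -> ~{per_query_ml:.0f} ml per query")

# 20 questions -> ~25 ml per query
# 50 questions -> ~10 ml per query
```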

That estimate was based on GPT-3, which has 175 billion parameters. The newer GPT-4, reported to have roughly 1.7 to 1.8 trillion parameters, could use even more resources, and according to OpenAI's Trevor Cai, models are only expected to grow larger.

Although this raises concerns about power consumption in data centers, the same does not necessarily hold for water use.

It’s important to clarify what it means for a data center to "consume" water: in water-accounting terms, water is consumed when it is withdrawn from the local environment and not returned, usually because it evaporates. Nor does the AI hardware itself consume water directly; air handling systems, specifically evaporative or "swamp" coolers, account for most of a data center's water use.

These evaporative coolers keep servers from overheating by evaporating water to chill the air. Their use is a design choice, though, not a universal one: in colder regions, dry cooling systems or "free cooling" can be more suitable, while refrigerant-based systems are often favored in warmer, drought-prone areas.
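Basic physics shows why evaporation consumes so much water. Evaporating water absorbs its latent heat of vaporization, about 2.26 MJ per kilogram, so an idealized evaporative cooler rejecting one kilowatt-hour of server heat evaporates roughly 1.6 liters. The sketch below works through that idealized case; real coolers also lose water to blowdown and drift, so actual usage varies.

```python
# Idealized estimate: liters of water evaporated per kWh of server
# heat rejected by an evaporative cooler. One liter of water weighs
# about one kilogram. Real systems deviate from this ideal.

LATENT_HEAT_MJ_PER_KG = 2.26  # latent heat of vaporization of water
MJ_PER_KWH = 3.6              # energy in one kilowatt-hour

liters_per_kwh = MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG
print(f"~{liters_per_kwh:.1f} L evaporated per kWh of heat rejected")
# ~1.6 L evaporated per kWh of heat rejected
```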

Microsoft, for instance, opted for refrigerant-based cooling at its Arizona data center after resolving a wastewater dispute with the city of Goodyear.

Alternative cooling methods, though available, usually result in higher energy consumption, which is a growing concern as data centers already face power shortages.

When it comes to new data centers, the decision to use water-intensive evaporative cooling is often financial. Hyperscale companies prioritize cost-efficiency, and water is highly effective at removing heat, which lowers electricity costs and allows denser builds in areas where power is constrained.
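A minimal sketch makes that financial logic concrete. Power usage effectiveness (PUE) is the ratio of total facility power to IT power; the PUE values and electricity price below are illustrative assumptions, not figures from the report.

```python
# Hypothetical comparison: annual electricity for a 10 MW IT load
# under two cooling designs. The PUE values (total facility power
# divided by IT power) and the price are illustrative assumptions.

IT_LOAD_MW = 10
HOURS_PER_YEAR = 8760
PRICE_PER_MWH = 80  # assumed electricity price in $/MWh

for design, pue in (("evaporative", 1.2), ("refrigerant", 1.4)):
    mwh = IT_LOAD_MW * pue * HOURS_PER_YEAR
    print(f"{design}: {mwh:,.0f} MWh/yr, ~${mwh * PRICE_PER_MWH:,.0f}")

# evaporative: 105,120 MWh/yr, ~$8,409,600
# refrigerant: 122,640 MWh/yr, ~$9,811,200
```

On these assumed numbers, the water-cooled design saves on the order of a million dollars a year in electricity, the kind of margin that drives the choice.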

In areas where water is cheaper and more accessible, such as the Great Lakes region, data center operators are likely to opt for evaporative cooling.

However, the increasing scale of AI models is pushing data centers toward new cooling technologies. Nvidia’s Grace Blackwell Superchips, for example, require direct liquid cooling (DLC) because of their high power density. While DLC is more energy-efficient than traditional fan-based systems, it poses retrofit challenges for data centers that weren’t built with it in mind.

Despite these challenges, the shift to liquid cooling could eventually reduce water usage. Dry coolers paired with DLC can be more efficient than evaporative coolers, and the heat captured by liquid-cooled systems can potentially be reused, as explored in hypothetical models where waste heat from AI training supports agricultural operations such as greenhouses.
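To put a hypothetical number on the greenhouse idea: a liquid-cooled cluster converts nearly all of its electrical draw into low-grade heat. The sketch below assumes a 1,000-accelerator cluster at roughly 1 kW each with 80 percent of the heat recoverable through the liquid loop; every figure is an illustrative assumption.

```python
# Hypothetical heat-reuse sketch: reusable warmth from a
# liquid-cooled training cluster. All numbers are illustrative.

ACCELERATORS = 1_000
KW_PER_ACCELERATOR = 1.0  # assumed average draw; nearly all becomes heat
CAPTURE_FRACTION = 0.8    # assumed share recoverable via the liquid loop

reusable_kw = ACCELERATORS * KW_PER_ACCELERATOR * CAPTURE_FRACTION
print(f"~{reusable_kw:.0f} kW of reusable low-grade heat")
# ~800 kW of reusable low-grade heat
```

Whether that heat can actually warm a greenhouse depends on distance, water temperature, and seasonal demand, which is why such models remain hypothetical.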

However, until liquid cooling becomes more widespread, concerns about data center water consumption are likely to persist.