Earlier this month, OpenAI announced the release of GPT-4o, a model that promises to be faster and more “human” than its predecessors. AI has become a leading force of innovation. Yet beneath the surface lies a significant challenge: the heavy consumption of water used to cool the data centers that power it.
Research has indicated that it takes approximately one 16.9 oz (roughly 500 mL) bottle of water for ChatGPT to answer 50 questions. According to OpenAI CEO Sam Altman, ChatGPT now has 100 million weekly active users, which adds up to a staggering amount of water. A single data center can consume millions of gallons of water per year. Google's data centers, for instance, used approximately 4.3 billion gallons of water in 2021, enough to fill over 6,500 Olympic-sized swimming pools. The water is typically sourced from local municipalities, groundwater, or even man-made lakes and ponds, which can strain local water supplies, especially in areas vulnerable to drought.
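To make these figures concrete, here is a minimal back-of-the-envelope sketch in Python. The per-question amount follows from the bottle-per-50-questions estimate above, and the pool conversion assumes an Olympic pool holds roughly 660,000 gallons; the number of questions each user asks per week is a purely hypothetical assumption, so the weekly total is illustrative only.

```python
# Back-of-the-envelope sketch of the scale described above.
# The per-question figure follows from the cited estimate (one 16.9 oz /
# ~500 mL bottle per 50 questions); questions per user per week is a
# hypothetical assumption made purely for illustration.

ML_PER_BOTTLE = 500                 # 16.9 oz is roughly 500 mL
QUESTIONS_PER_BOTTLE = 50           # from the estimate cited above
ML_PER_QUESTION = ML_PER_BOTTLE / QUESTIONS_PER_BOTTLE   # ~10 mL per question

WEEKLY_USERS = 100_000_000          # reported weekly active users
QUESTIONS_PER_USER_PER_WEEK = 15    # hypothetical, illustrative only

weekly_liters = WEEKLY_USERS * QUESTIONS_PER_USER_PER_WEEK * ML_PER_QUESTION / 1000
weekly_gallons = weekly_liters / 3.785

# Google's reported 2021 usage, expressed in Olympic pools (~660,000 gal each)
pools = 4.3e9 / 660_000

print(f"Illustrative ChatGPT water use: {weekly_liters:,.0f} L/week "
      f"(~{weekly_gallons:,.0f} gallons)")
print(f"Google's 2021 usage is roughly {pools:,.0f} Olympic pools")
```

Even with this conservative hypothetical usage figure, the total runs to millions of gallons of water per week.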
The amount of water used to run AI servers varies with factors such as the type of cooling system, the climate where the data center is located, and the efficiency of the server infrastructure. Water is used primarily to cool the data centers, dissipating the heat generated by the servers, and several methods are in use. Direct liquid cooling circulates water straight to the heat-generating components inside the servers. Indirect liquid cooling instead uses water to cool a separate heat exchanger, which in turn cools the servers.
Some data centers use cooling towers, which dissipate heat from the servers through water evaporation; the evaporated water is consumed and must be replaced. How much water these methods use depends on the specific design and efficiency of the cooling system.
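One way to quantify this is Water Usage Effectiveness (WUE), an industry metric defined as the liters of water consumed per kilowatt-hour of IT equipment energy. The short sketch below applies an assumed WUE of 1.8 L/kWh (a commonly cited industry average) to a hypothetical 10 MW facility; both numbers are assumptions chosen for illustration, not figures from any specific data center.

```python
# Minimal sketch: estimating annual cooling water from Water Usage
# Effectiveness (WUE), defined as liters of water per kWh of IT energy.
# The WUE value and facility size are assumptions chosen for illustration.

WUE_L_PER_KWH = 1.8        # assumed; a commonly cited industry average
IT_LOAD_KW = 10_000        # hypothetical 10 MW facility running year-round
HOURS_PER_YEAR = 8_760

annual_it_energy_kwh = IT_LOAD_KW * HOURS_PER_YEAR
annual_water_liters = annual_it_energy_kwh * WUE_L_PER_KWH
annual_water_gallons = annual_water_liters / 3.785

print(f"IT energy:   {annual_it_energy_kwh:,.0f} kWh/year")
print(f"Water usage: {annual_water_liters:,.0f} L/year "
      f"(~{annual_water_gallons / 1e6:.1f} million gallons)")
```

At that scale, even a single mid-sized facility lands squarely in the millions-of-gallons-per-year range mentioned above.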
"Our goal is to become 'water positive' by 2030. We’re working to replenish more water than we consume globally and put back more water than we use."
-Satya Nadella (CEO of Microsoft)
Overall, water usage in data centers, including those hosting AI servers, remains a significant concern, and the tech industry is taking steps to find sustainable solutions. Innovations such as liquid immersion cooling, which uses dielectric fluids instead of water, are being developed to cut water consumption. Water recycling systems within data centers can also make a significant difference: Microsoft's data center in Quincy, Washington, for example, has reduced its water usage by reusing water multiple times before discharge. Building data centers in cooler climates reduces the need for water-intensive cooling by leveraging naturally lower temperatures to keep servers at optimal operating conditions. Governments and industry leaders are also starting to introduce guidelines and standards for water usage in data centers to encourage more responsible water management practices.
As AI continues to drive the digital revolution, the environmental impact of its infrastructure must be addressed. In the end, the future of AI and data centers should be about balance—enjoying the benefits of technology while taking care of our planet. By making thoughtful choices now, we can ensure that our technological advances don’t come at the expense of our natural resources. Collaboration between the tech industry, governments, and environmental organizations is essential.
Policymakers can shape the future by enacting regulations that set clear standards for water usage in data centers, ensuring accountability and promoting best practices across the industry.