'Totally Insane': Sam Altman Dismisses Claims Of ChatGPT Using 17 Gallons Of Water Per Query

While dismissing criticism of the massive water consumption attributed to AI data centres, Altman noted that concerns about the technology’s energy requirements are legitimate.


Sam Altman, the CEO of OpenAI, has dismissed criticism of artificial intelligence's environmental impact, pushing back against claims that data centres place excessive strain on water resources. He also drew parallels between the energy consumption of AI systems and that of human activity.

Altman made the remarks during an interview with The Indian Express on the sidelines of the recently concluded India AI Impact summit.


At the Express Adda event on Feb. 20, he dismissed concerns about the water requirements of data centres and said, “Water is totally fake. It used to be true. We used to do evaporative cooling in data centres, but now … we don't do that.”

“You see things on the internet, (like): ‘Don't use ChatGPT. It's 17 gallons of water for each query or whatever.' This is completely untrue – totally insane. No connection to reality,” he added.


According to CNBC, data centres have long been heavy consumers of water, using it to keep servers and electrical systems from overheating. However, advances in cooling design mean that a growing number of new facilities are now operating without any reliance on water-based systems.

However, efficiency gains may not be enough to curb overall demand. A report published last month by Xylem in collaboration with Global Water Intelligence warned that water use for cooling could increase more than threefold over the next 25 years, driven by the rapid expansion of computing, placing a growing strain on water infrastructure.


Altman noted that concerns about AI's energy requirements are legitimate. “Not per query, but in total – because the world is using so much AI ... and we need to move towards nuclear or wind and solar very quickly,” he said.

He also went on to compare the energy requirements of AI models with those of human beings.

“One of the things that is always unfair in this comparison is that people talk about how much energy it takes to train an AI model ... But it also takes a lot of energy to train a human. It takes like 20 years of life, and all the food you eat before that time, before you get smart,” the OpenAI CEO said.

“The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way,” he added.

The comments by Altman have triggered a debate on social media, with many users criticising the OpenAI CEO. 

One user said, “That is dystopian. It makes human development sound like a bug in the system, and it makes sacrificing human and creational flourishing for more computational power sound logical.”


Another comment read, “This analogy, however, dangerously reduces the human spirit to a mechanical balance sheet of calories and time. True intelligence blooms from existential struggles, emotional landscapes, and societal bonds – experiences no algorithm can replicate or quantify.”

Data centres consumed roughly 1.5% of the world's electricity in 2024, according to the International Energy Agency. It estimates that power demand from these facilities will rise by around 15% annually through 2030 – a pace more than four times faster than electricity use across the rest of the economy.
