OpenAI Chief Executive Officer Sam Altman has pushed back against growing concerns over the environmental impact of artificial intelligence, describing widely circulated claims about excessive water usage by AI systems as “completely untrue.”
Speaking at an event hosted by The Indian Express during a major AI summit in India, Altman said assertions that a single ChatGPT query consumes large volumes of water have no basis in current operational reality.
Altman acknowledged that water usage was once a legitimate issue when data centers relied on evaporative cooling systems, but stressed that such practices are no longer standard in many modern facilities. He criticised online claims suggesting that each ChatGPT query uses as much as 17 gallons of water, insisting that such figures are misleading and disconnected from how contemporary infrastructure operates. According to him, technological improvements have significantly reduced water dependency in AI processing environments.
While dismissing water-related allegations, Altman conceded that concerns about total energy consumption are more reasonable, given the rapid global adoption of AI tools. He said the issue is not the energy cost of a single query but the cumulative demand as AI systems become deeply integrated into everyday life and business operations. In that context, he argued that the solution lies in accelerating the global transition toward cleaner energy sources such as nuclear, wind and solar power.
Addressing comparisons about energy use, Altman rejected suggestions that one ChatGPT query consumes energy equivalent to multiple smartphone battery charges. He described such estimates as exaggerated and criticised what he called unfair narratives that compare the energy required to train large AI models with the effort it takes a human to answer a single question. The interviewer had referenced claims about per-query energy use drawn from a past conversation with Microsoft co-founder Bill Gates, which Altman firmly disputed.
Altman further argued that a more balanced comparison would weigh the energy required to train and sustain human intelligence over decades, including education and societal development, against the marginal energy an already trained AI model needs to generate a response. In his view, when measured on that basis, AI systems may already be approaching or surpassing human-level energy efficiency per query.
His remarks come amid increasing scrutiny of data centers worldwide, as researchers continue independent studies into AI’s environmental footprint in the absence of mandatory disclosure requirements for tech companies.
