Sam Altman Dismisses AI Water Usage Concerns, Compares Energy Costs to Human Training

Published on 24 February 2026

OpenAI CEO Defends AI Resource Consumption


Sam Altman, the CEO of OpenAI, has pushed back against growing criticism of the environmental footprint of artificial intelligence. During a recent appearance at the India AI Impact Summit, Altman addressed concerns about the resources required to power large language models, specifically dismissing reports about water consumption as unfounded.


Responding to claims that ChatGPT consumes significant amounts of water per query, Altman dismissed such figures as "completely untrue" and disconnected from reality. While acknowledging that water usage was a larger issue during the era of evaporative cooling, he suggested that current technologies have mitigated these concerns, though he did not disclose specific usage statistics.


The Human vs. AI Energy Debate


Altman attempted to reframe the narrative around AI energy consumption by drawing a controversial comparison to human biology. He argued that the energy expenditure required to "train" a human—encompassing 20 years of food consumption and the evolutionary history of the species—is immense.


He posited that a fair comparison should measure the energy required to answer a query after the respective training phases are complete. By that measure, Altman claimed, AI has likely already surpassed human labor in energy efficiency. The comparison arrives against a backdrop of anxiety over AI's potential to displace jobs, a topic Altman has previously acknowledged.


Future Projections and Energy Shifts


Despite his defense of AI efficiency, Altman conceded that the escalating electricity demands of data centers are a valid concern. He emphasized the necessity for the energy sector to transition rapidly to sustainable sources such as nuclear, wind, and solar power to support the growing computational load.


Industry data supports the urgency of this transition. The International Energy Agency projects that global data center electricity usage could nearly double by 2030, largely driven by AI advancements. In the United States specifically, estimates suggest data centers could account for up to 12% of the nation's total electricity consumption by 2028. While Altman advocates for a shift in energy infrastructure, his comparison of biological and artificial resource consumption continues to spark debate within the tech community.
