
Sam Altman Dispels AI Water Usage Myths, Urges Renewable Energy Transition for Growing AI Demands

OpenAI CEO Sam Altman recently addressed concerns surrounding the environmental footprint of artificial intelligence, particularly its water and energy consumption, during a discussion hosted by The Indian Express. Altman's remarks came as he participated in a major AI summit in India, which gave him a global platform for his perspectives on the rapidly evolving technology and its societal implications. He aimed to correct widespread misconceptions while acknowledging legitimate environmental challenges posed by AI's rapid growth.

During the event, Altman unequivocally dismissed claims regarding AI's current water usage, branding them as "totally fake." He specifically targeted narratives suggesting exorbitant water consumption per AI query, such as the widely circulated figure of "17 gallons of water for each query," which he labeled "completely untrue, totally insane, no connection to reality." Altman clarified that significant water usage, primarily through evaporative cooling in data centers, was indeed a "real issue" in the past. However, he asserted that modern data center practices have largely moved away from such water-intensive cooling methods, making current concerns about AI's water consumption, in his view, unfounded. Altman did not detail the shift, but it presumably reflects the adoption of air cooling, closed-loop liquid cooling, or other thermal management systems that minimize water dependency. The CEO's strong rebuttal underscored a perceived disconnect between public discourse and the current operational realities of advanced AI infrastructure, highlighting the need for accurate information in public debates about technology's environmental impact.
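For a sense of scale, the viral figure can be set against the per-query water estimate Altman has cited in his own public writing, roughly 0.000085 gallons, or about one-fifteenth of a teaspoon. That number does not appear in this interview and is an assumption here; the arithmetic is only meant to show how far apart the two figures are:

```python
# Compare the viral "17 gallons per query" claim against the per-query
# water estimate Altman has cited in his own writing (~0.000085 gallons,
# about one-fifteenth of a teaspoon). Both inputs are assumptions here,
# not figures from this article.

viral_claim_gal = 17.0
cited_estimate_gal = 0.000085

factor = viral_claim_gal / cited_estimate_gal
print(f"viral claim is ~{factor:,.0f}x the cited estimate")  # ~200,000x
```

If the cited estimate is even roughly right, the viral claim is off by about five orders of magnitude, which is consistent with Altman's "no connection to reality" characterization.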

While refuting the severity of AI’s water footprint, Altman readily conceded that the overall energy consumption of AI is a "fair" and significant concern. He emphasized that this concern stems not from the energy required "per query," but from the aggregate demand across the entire AI ecosystem. "The world is now using so much AI," Altman observed, pointing to the burgeoning deployment of AI technologies across industries, research, and daily life. This widespread adoption, he argued, necessitates a rapid global transition towards sustainable energy sources. Altman advocated for an accelerated shift "towards nuclear or wind and solar very quickly" to power the increasingly energy-intensive AI infrastructure. His call for renewable and clean energy solutions underscores the urgent need for strategic planning and investment in sustainable power generation to support technological advancement without exacerbating climate change.

The debate surrounding AI’s environmental impact is further complicated by a lack of transparency within the tech industry. Currently, there is no legal mandate for technology companies to publicly disclose their energy and water consumption data. This absence of mandatory reporting has compelled independent scientists and researchers to undertake their own studies to estimate AI’s environmental impact. These independent investigations often rely on indirect methods, such as analyzing publicly available information on data center construction, server specifications, and energy grid capacities, to model the resource demands of large-scale AI operations. Moreover, the rapid expansion of data centers, driven in part by AI’s demands, has been linked to rising electricity prices in various regions, putting a strain on existing energy grids and raising questions about energy equity and infrastructure resilience. These independent efforts are crucial for providing an unbiased understanding of AI’s environmental footprint in the absence of corporate disclosure.

During the interview, Altman also addressed a specific, widely cited comparison, reportedly stemming from a conversation with Bill Gates, which posited that a single ChatGPT query consumes the energy equivalent of 1.5 iPhone battery charges. Altman unequivocally dismissed this comparison, stating, "There’s no way it’s anything close to that much." This denial reflects a broader frustration among AI developers regarding what they perceive as oversimplified or exaggerated metrics used in public discussions about AI’s energy usage. Such comparisons, while catchy, often fail to account for the complexities of energy consumption in large-scale computing systems, potentially leading to misinformed public perceptions.
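A rough sanity check shows why the iPhone comparison fails. Assuming a recent iPhone battery holds about 13 Wh, and taking the average-query figure of roughly 0.34 Wh that Altman has cited elsewhere (both numbers are assumptions here; the interview itself gives no figures):

```python
# Sanity-check the claim that one ChatGPT query equals 1.5 iPhone charges.
# Both figures below are rough outside assumptions, not numbers from the
# article or the interview.

iphone_battery_wh = 13.0                    # a recent iPhone holds ~13 Wh
claimed_query_wh = 1.5 * iphone_battery_wh  # the circulated claim: ~19.5 Wh

cited_query_wh = 0.34  # average-query figure Altman has cited elsewhere

ratio = claimed_query_wh / cited_query_wh
print(f"claim: {claimed_query_wh:.1f} Wh vs cited: {cited_query_wh} Wh")
print(f"the claim is ~{ratio:.0f}x the cited figure")
```

On these assumptions the circulated comparison overstates the cited per-query energy by more than fifty-fold, which would explain Altman's "no way it's anything close" response.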

Altman further expressed his view that many discussions surrounding ChatGPT’s energy usage are "unfair," particularly when they draw comparisons between the energy expended to "train an AI model" and the energy cost for a human to perform "one inference query" (i.e., answer a single question). He argued that such a comparison overlooks the immense, often unquantified, "energy" investment required to develop human intelligence.

Elaborating on this provocative comparison, Altman presented a philosophical argument about the energy cost of human intelligence. "It also takes a lot of energy to train a human," he stated, highlighting the "20 years of life and all of the food you eat during that time before you get smart." He extended this analogy further, encompassing the vast evolutionary history that has shaped human cognition: "it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you." This expansive view suggests that the "training cost" for human intelligence is an astronomical, cumulative expenditure of biological and environmental resources over millennia.

In light of this, Altman proposed what he considers a more "fair comparison": assessing "how much energy does it take once its model is trained to answer that question versus a human?" Measured this way, comparing the energy an already-trained AI model needs to answer a query against the energy a human needs for the same task, Altman believes "AI has already caught up on an energy efficiency basis, measured that way." This framing shifts the focus from the colossal energy invested in creating an AI model to its operational efficiency once deployed, suggesting that AI could offer a more energy-efficient means of knowledge processing and problem-solving than human cognitive effort, especially once the "training" investment for humans is counted.
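Altman's inference-to-inference comparison can be made concrete with deliberately rough numbers. Every figure below is an illustrative assumption for this sketch (resting human metabolic power, time to answer a question, per-query AI energy), not data from the interview:

```python
# Illustrative only: compare post-training inference energy for an AI query
# with the metabolic energy a human spends answering the same question.
# Every number here is an assumption made for this sketch.

human_metabolic_w = 100.0  # resting whole-body metabolic rate, ~100 W
seconds_to_answer = 30.0   # assume the human takes ~30 s to answer

human_answer_wh = human_metabolic_w * seconds_to_answer / 3600.0  # ~0.83 Wh
ai_query_wh = 0.34         # per-query figure Altman has cited elsewhere

print(f"human: ~{human_answer_wh:.2f} Wh, AI query: ~{ai_query_wh} Wh")
```

On these assumptions the two land in the same order of magnitude, which is the (much weaker) sense in which "caught up" can be quantified; changing the assumed answer time or metabolic rate moves the result accordingly.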


The full interview with Sam Altman, covering ChatGPT, AI risks, and future developments, is publicly available; the segment addressing water and energy usage begins approximately 26 minutes and 35 seconds into the video.

Anthony Ha, TechCrunch's weekend editor, authored the original report. Ha, who previously served as a tech reporter at Adweek, a senior editor at VentureBeat, a local government reporter at the Hollister Free Lance, and vice president of content at a VC firm, is based in New York City. He can be reached at [email protected].
