If you’ve been following the ongoing debate about AI’s environmental footprint, you may have come across clips from Sam Altman’s keynote interview at The Indian Express AI Summit on February 20, 2026, where he addressed growing concerns about the resource consumption of ChatGPT and AI data centers.
When asked about “the amount of natural resources going into data centers, the amount of water,” Altman responded bluntly:
“Water is totally fake… You see these things on the internet like, ‘Don’t use ChatGPT, it’s 17 gallons of water per query.’ This is completely untrue. Totally insane. No connection to reality.”
He explained that while older data centers used evaporative cooling, many modern facilities no longer rely on those methods in the same way. He also rejected widely circulated comparisons about energy usage per query. When the interviewer referenced claims that one ChatGPT query consumes the equivalent of “10 iPhones worth of battery” — or even “one or one-and-a-half iPhones” — Altman replied:
“There’s no way it’s anything close to that much… Way, way, way less.”
He went further, arguing that comparisons between AI training and human intelligence are often unfair:
“One of the things that is unfair is that people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human… It takes like 20 years of life and all of the food you eat during that time.”
His point: the fairer comparison is the energy of an AI inference query set against that 20-year human “training” cost. By that logic, he argued, AI may have already achieved energy parity with human cognition.
I actually agree with part of his pushback. Some of the per-query numbers circulating online are inflated, often resting on outdated cooling assumptions or oversimplified analogies. And when he shifts the conversation from per-query sensationalism to total system load, he is finally talking in terms CIOs understand: infrastructure and scale. That shift toward measuring total energy demand as AI scales is both correct and essential.
Where I think his argument loses focus is when the conversation turns philosophical, comparing machine intelligence to human biology. When the comparison expands to include the 20-year energy cost of raising a human, the discussion moves away from measurable infrastructure and into territory that enterprise leaders cannot practically govern or measure. CIOs aren’t tasked with defending energy consumption in abstract philosophical terms. They’re responsible for managing the real, measurable footprint of the systems they run.
In the same exchange, Altman acknowledged that total energy consumption from AI is real:
“What is fair though, is the energy consumption — not per query, but in total — because the world is now using so much AI.”
That distinction is where CIOs should focus.
The strategic question is not whether a single ChatGPT query consumes 17 gallons of water. As Altman himself acknowledged, the real issue is not the per-query footprint, but the aggregate energy consumption as the world increasingly relies on AI at scale. In fact, sustainability teams working with enterprises are already seeing the effects of that growth — some organizations report AI-related infrastructure costs and emissions doubling month-over-month as experimentation and pilots expand.
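The shift from per-query to aggregate thinking is, at bottom, simple arithmetic: a per-query figure small enough to dismiss still compounds into utility-scale demand at global volumes. A minimal back-of-the-envelope sketch makes the point — every number below is an illustrative placeholder, not a measurement from OpenAI or any provider:

```python
# Why aggregate demand matters even when per-query figures are tiny.
# Both constants are hypothetical, for illustration only.

PER_QUERY_WH = 0.3               # assumed energy per inference query (Wh)
QUERIES_PER_DAY = 1_000_000_000  # assumed global daily query volume

def annual_energy_gwh(per_query_wh: float, queries_per_day: float) -> float:
    """Aggregate annual energy in gigawatt-hours (1 GWh = 1e9 Wh)."""
    wh_per_year = per_query_wh * queries_per_day * 365
    return wh_per_year / 1e9

if __name__ == "__main__":
    # A fraction of a watt-hour per query still aggregates to
    # grid-scale annual demand at a billion queries a day.
    print(f"{annual_energy_gwh(PER_QUERY_WH, QUERIES_PER_DAY):,.1f} GWh/year")
```

Whatever the true per-query figure turns out to be, the multiplication by query volume is what CIOs actually have to plan capacity and budgets around.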
There is another dimension to this conversation that rarely shows up in viral AI debates: the lifecycle footprint of the infrastructure itself. Training and running modern AI systems rely on specialized chips built with rare metals and manufactured through extremely resource-intensive semiconductor processes. As AI demand accelerates, organizations are also cycling through hardware generations faster than traditional enterprise infrastructure lifecycles. That raises a separate sustainability question around circularity — how quickly AI hardware becomes obsolete, how it is reused or recycled, and what the long-term material footprint of this infrastructure looks like.
AI isn’t just another application feature layered on top of existing systems. In most enterprises, it behaves more like an infrastructure multiplier.
As organizations scale generative AI, they increase compute density, expand storage, move larger volumes of data, retrain models more frequently and maintain always-on inference environments. Those decisions influence power demand, cooling requirements and hardware refresh cycles. They are architectural choices — and architectural choices carry sustainability implications.
In many organizations, sustainability still sits in an ESG reporting lane, while AI lives in innovation or digital transformation teams. AI deployment intersects with infrastructure strategy, procurement, cloud placement and operating model design. Sustainability outcomes are shaped across all of those domains.
Data management is not the only lever — but in my experience, it’s one of the most overlooked.
Governance decisions directly affect the infrastructure footprint. Poor data hygiene drives unnecessary compute. Duplicate datasets inflate storage demand. Over-retention policies increase long-term infrastructure load. Uncurated training corpora expand model size and retraining frequency. Inefficient data pipelines require additional processing cycles. These are operational governance issues — and collectively they shape aggregate energy demand.
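One of those levers — duplicate datasets — is concrete enough to sketch. The idea is simply to group files by a content hash and flag any group with more than one copy; real data estates need catalog tooling rather than a filesystem walk, so treat this as a toy illustration of the principle, not production governance code:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicate_files(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 of their contents.

    Any group with more than one path is redundant storage —
    the kind of duplication that quietly inflates infrastructure load.
    """
    groups: defaultdict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Keep only hashes that appear more than once.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Hash-based deduplication is the same mechanism storage vendors use for block-level dedup; doing it at the dataset level is a governance decision, which is exactly the point.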
Other levers matter equally: model selection, workload placement in regions with different grid mixes, hardware efficiency, vendor sustainability commitments and procurement strategy. AI sustainability doesn’t belong to one team. It emerges from how well architecture, procurement, governance and infrastructure decisions are coordinated.
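The workload-placement lever above is also easy to make concrete: the same workload emits very different amounts of CO2 depending on the carbon intensity of the regional grid it runs on. The region names and intensity figures below are hypothetical; in practice they would come from grid operators or a carbon-data provider:

```python
# Hypothetical grid carbon intensities in gCO2 per kWh.
# Real values vary by hour and come from grid or carbon-data sources.
GRID_INTENSITY_G_PER_KWH = {
    "region-a": 380.0,
    "region-b": 40.0,
    "region-c": 700.0,
}

def emissions_kg(region: str, workload_kwh: float) -> float:
    """Estimated CO2 (kg) for running a workload in a given region."""
    return GRID_INTENSITY_G_PER_KWH[region] * workload_kwh / 1000.0

def greenest_region(workload_kwh: float) -> str:
    """Pick the region minimizing estimated emissions for this workload."""
    return min(GRID_INTENSITY_G_PER_KWH,
               key=lambda r: emissions_kg(r, workload_kwh))
```

For latency-tolerant workloads like batch retraining, this kind of placement decision sits squarely with architecture and procurement — not with an ESG reporting team.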
Altman is correct that exaggerated metrics distort the conversation. However, focusing solely on whether a figure is ‘totally fake’ risks narrowing the discussion too far. If sustainability planning depends on viral analogies or clever rhetorical comparisons, we’re asking the wrong questions. What matters is aggregate measurement, lifecycle transparency and disciplined governance.
AI strategy is now infrastructure strategy. Infrastructure strategy is inseparable from sustainability strategy. And sustainability, at its core, is a governance discipline.
The companies that stand out won’t just be the ones that adopted AI fastest. They’ll be the ones that scaled it thoughtfully — with transparency, architectural rigor and data discipline built in from the start.
This article was made possible by our partnership with the IASA Chief Architect Forum. The CAF’s purpose is to test, challenge and support the art and science of Business Technology Architecture and its evolution over time as well as grow the influence and leadership of chief architects both inside and outside the profession. The CAF is a leadership community of the IASA, the leading non-profit professional association for business technology architects.
This article is published as part of the Foundry Expert Contributor Network.

