Sam Altman is right about the ‘fake’ AI water usage claims — but CIOs still have a massive sustainability problem

If you’ve been following the ongoing debate about AI’s environmental footprint, you may have come across clips from Sam Altman’s keynote interview at The Indian Express AI Summit on February 20, 2026, where he addressed growing concerns about the water and energy demands of ChatGPT and AI data centers. 

When asked about “the amount of natural resources going into data centers, the amount of water,” Altman responded bluntly: 

“Water is totally fake… You see these things on the internet like, ‘Don’t use ChatGPT, it’s 17 gallons of water per query.’ This is completely untrue. Totally insane. No connection to reality.” 

He explained that while older data centers used evaporative cooling, many modern facilities no longer rely on those methods in the same way. He also rejected widely circulated comparisons about energy usage per query. When the interviewer referenced claims that one ChatGPT query consumes the equivalent of “10 iPhones worth of battery” — or even “one or one-and-a-half iPhones” — Altman replied: 

“There’s no way it’s anything close to that much… Way, way, way less.” 

He went further, arguing that comparisons between AI training and human intelligence are often unfair: 

“One of the things that is unfair is that people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human… It takes like 20 years of life and all of the food you eat during that time.” 

His suggested comparison, then, is the energy required for an AI inference query versus the energy required to ‘train’ a human over those two decades. By that logic, he argued, AI may have already achieved energy parity with human cognition. 

I actually agree with part of his pushback. Some of the per-query numbers circulating online are inflated, often relying on outdated cooling assumptions or overly simplified analogies. And when he shifts the conversation from per-query sensationalism to total system load, he is talking in terms CIOs understand: infrastructure and scale. That shift toward measuring total energy demand as AI scales is both correct and essential. 

Where I think his argument loses focus is when the conversation turns philosophical, comparing machine intelligence to human biology. When the comparison expands to include the 20-year energy cost of raising a human, the discussion moves away from measurable infrastructure and into territory that enterprise leaders cannot practically govern or measure. CIOs aren’t tasked with defending energy consumption in abstract philosophical terms. They’re responsible for managing the real, measurable footprint of the systems they run. 

In the same exchange, Altman acknowledged that total energy consumption from AI is real: 

“What is fair though, is the energy consumption — not per query, but in total — because the world is now using so much AI.” 

That distinction is where CIOs should focus. 

The strategic question is not whether a single ChatGPT query consumes 17 gallons of water. As Altman himself acknowledged, the real issue is not the per-query footprint, but the aggregate energy consumption as the world increasingly relies on AI at scale. In fact, sustainability teams working with enterprises are already seeing the effects of that growth — some organizations report AI-related infrastructure costs and emissions doubling month-over-month as experimentation and pilots expand. 

There is another dimension to this conversation that rarely shows up in viral AI debates: the lifecycle footprint of the infrastructure itself. Training and running modern AI systems rely on specialized chips built with rare metals and manufactured through extremely resource-intensive semiconductor processes. As AI demand accelerates, organizations are also cycling through hardware generations faster than traditional enterprise infrastructure lifecycles. That raises a separate sustainability question around circularity — how quickly AI hardware becomes obsolete, how it is reused or recycled, and what the long-term material footprint of this infrastructure looks like. 

AI isn’t just another application feature layered on top of existing systems. In most enterprises, it behaves more like an infrastructure multiplier. 

As organizations scale generative AI, they increase compute density, expand storage, move larger volumes of data, retrain models more frequently and maintain always-on inference environments. Those decisions influence power demand, cooling requirements and hardware refresh cycles. They are architectural choices — and architectural choices carry sustainability implications. 

In many organizations, sustainability still sits in an ESG reporting lane, while AI lives in innovation or digital transformation teams. AI deployment intersects with infrastructure strategy, procurement, cloud placement and operating model design. Sustainability outcomes are shaped across all of those domains. 

Data management is not the only lever — but in my experience, it’s one of the most overlooked. 

Governance decisions directly affect the infrastructure footprint. Poor data hygiene drives unnecessary compute. Duplicate datasets inflate storage demand. Over-retention policies increase long-term infrastructure load. Uncurated training corpora expand model size and retraining frequency. Inefficient data pipelines require additional processing cycles. These are operational governance issues — and collectively they shape aggregate energy demand. 
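One of those governance issues, duplicate datasets, is also one of the easiest to surface. As a minimal sketch, assuming a file-based data store, the snippet below groups files by content hash so that byte-identical copies can be flagged for deduplication. The function name and directory layout are illustrative, not from any specific platform:

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def find_duplicate_datasets(root: str, chunk_size: int = 1 << 20) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 content hash.

    Any group with more than one path is a set of byte-identical
    duplicates — candidates for deduplication before they inflate
    storage demand and downstream compute.
    """
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        h = hashlib.sha256()
        with path.open("rb") as f:
            # Hash in chunks so large datasets don't load into memory whole.
            while chunk := f.read(chunk_size):
                h.update(chunk)
        by_hash[h.hexdigest()].append(path)
    return {digest: paths for digest, paths in by_hash.items() if len(paths) > 1}
```

In practice the same idea scales up through object-store inventory reports or data-catalog metadata rather than a filesystem walk, but the governance lever is identical: measure duplication before paying to store and process it twice.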

Other levers matter equally: model selection, workload placement in regions with different grid mixes, hardware efficiency, vendor sustainability commitments and procurement strategy. AI sustainability doesn’t belong to one team. It emerges from how well architecture, procurement, governance and infrastructure decisions are coordinated. 
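Workload placement is the most mechanical of those levers. A carbon-aware scheduler can prefer regions with cleaner grid mixes whenever latency budgets allow. The sketch below illustrates the decision rule only; the region names, intensity figures (gCO2e/kWh), and latency numbers are invented assumptions, not real vendor data:

```python
# Illustrative region table: (grid carbon intensity in gCO2e/kWh,
# round-trip latency in ms to the primary user base). All values
# are made up for the example.
REGIONS = {
    "north-europe":   (120, 95),
    "us-east":        (380, 20),
    "us-west":        (210, 70),
    "asia-southeast": (470, 180),
}


def pick_region(regions: dict[str, tuple[int, int]], max_latency_ms: float) -> str:
    """Return the lowest-carbon region that meets the latency budget."""
    eligible = {name: vals for name, vals in regions.items() if vals[1] <= max_latency_ms}
    if not eligible:
        raise ValueError("no region meets the latency budget")
    # Minimize grid carbon intensity among the latency-eligible regions.
    return min(eligible, key=lambda name: eligible[name][0])
```

A batch retraining job with a generous latency budget lands in the cleanest grid, while an interactive inference workload with a tight budget stays close to users — the sustainability gain comes from making that trade-off explicit instead of defaulting everything to one region.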

Altman is correct that exaggerated metrics distort the conversation. However, focusing solely on whether a figure is ‘totally fake’ risks narrowing the discussion too far. If sustainability planning depends on viral analogies or clever rhetorical comparisons, we’re asking the wrong questions. What matters is aggregate measurement, lifecycle transparency and disciplined governance. 

AI strategy is now infrastructure strategy. Infrastructure strategy is inseparable from sustainability strategy. And sustainability, at its core, is a governance discipline. 

The companies that stand out won’t just be the ones that adopted AI fastest. They’ll be the ones that scaled it thoughtfully — with transparency, architectural rigor and data discipline built in from the start. 

This article was made possible by our partnership with the IASA Chief Architect Forum. The CAF’s purpose is to test, challenge and support the art and science of Business Technology Architecture and its evolution over time as well as grow the influence and leadership of chief architects both inside and outside the profession. The CAF is a leadership community of the IASA, the leading non-profit professional association for business technology architects. 

This article is published as part of the Foundry Expert Contributor Network.



Category: News, March 31, 2026
