
Sam Altman claims an average ChatGPT query uses ‘roughly one fifteenth of a teaspoon’ of water

Altman shared the unsourced statistic in a new blog post.

Photo collage of Sam Altman.
Image: Cath Virginia / The Verge; Getty Images
Jay Peters

OpenAI CEO Sam Altman, in a blog post published Tuesday, says an average ChatGPT query uses about 0.000085 gallons of water, or “roughly one fifteenth of a teaspoon.” He made the claim as part of a broader post on his predictions about how AI will change the world.
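As a quick sanity check on that conversion (assuming the US customary definition of 768 teaspoons per gallon, which the post does not spell out), the figure does work out to roughly one fifteenth of a teaspoon:

```python
# Convert Altman's 0.000085 gallons per query into teaspoons.
# Assumes US customary units: 1 gallon = 768 teaspoons.
GALLONS_PER_QUERY = 0.000085
TEASPOONS_PER_GALLON = 768

teaspoons = GALLONS_PER_QUERY * TEASPOONS_PER_GALLON
print(f"{teaspoons:.4f} teaspoons per query")          # ~0.0653
print(f"roughly 1/{1 / teaspoons:.0f} of a teaspoon")  # roughly 1/15
```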

“People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes,” he says. He also argues that “the cost of intelligence should eventually converge to near the cost of electricity.” OpenAI didn’t immediately respond to a request for comment on how Altman came to those figures.
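Those comparisons roughly check out under assumed appliance wattages, which Altman's post does not specify. Below is a sketch using a roughly 1,200-watt oven element and a roughly 10-watt high-efficiency LED bulb; both figures are illustrative assumptions.

```python
# Rough check of the 0.34 Wh comparison using assumed wattages
# (the oven and bulb power draws below are illustrative, not from the post).
QUERY_WH = 0.34

OVEN_WATTS = 1200  # assumed oven element draw
LED_WATTS = 10     # assumed high-efficiency LED bulb draw

oven_seconds = QUERY_WH / OVEN_WATTS * 3600  # Wh -> seconds at that power
led_minutes = QUERY_WH / LED_WATTS * 60      # Wh -> minutes at that power

print(f"oven: about {oven_seconds:.1f} seconds")     # ~1.0 s
print(f"LED bulb: about {led_minutes:.1f} minutes")  # ~2.0 min
```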

AI companies have come under scrutiny for the energy costs of their technology. This year, for example, researchers forecast that AI could consume more power than Bitcoin mining by the end of the year. In an article last year, The Washington Post worked with researchers to determine that a 100-word email “generated by an AI chatbot using GPT-4” required “a little more than 1 bottle” of water. The publication also found that water usage can depend on where a data center is located.
