AI energy efficiency comparisons ‘unfair’ bleats Sam Altman, citing the amount of energy needed to evolve, then train, a human — one ‘takes like 20 years of life and all of the food you eat during that time before you get smart,’ he argues
‘AI has already caught up… measured that way,’ asserts the AI mogul.
OpenAI CEO Sam Altman took part in a wide-ranging Q&A on Friday, answering dozens of rapid-fire questions during a 60-minute session hosted by The Indian Express. Not for the first time, Altman stoked controversy. This time, he bemoaned “unfair” comparisons between the efficiency of AI inference queries and human thought. In Altman’s view, the comparison is skewed because humans have millennia of evolutionary smarts and accumulated technological teaching behind them, yet individuals still require “like 20 years of life and all of the food you eat during that time before you get smart.”
🚨 SAM ALTMAN: “People talk about how much energy it takes to train an AI model … But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” pic.twitter.com/vRuVnnmzjB (February 21, 2026)
Chief Nerd clipped the eyebrow-raising Q&A segment for convenient sharing.
In the above video segment, the AI business torchbearer begins by stating “One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query.” But, according to Altman, it also takes a lot of energy to train a human.
“It takes like 20 years of life and all of the food you eat during that time before you get smart,” the OpenAI CEO told the assembled audience awaiting gems of wisdom. Moreover, Altman wants to roll the “evolution of the hundred billion people,” and humanity’s struggle “not to get eaten by predators and learn how to like figure out science and whatever,” into the equation. If we did that calculation, Altman appears to reason, “probably AI has already caught up on an energy efficiency basis… Measured that way.”
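For a sense of scale, here is a minimal back-of-envelope sketch of the ‘food to train a human’ framing. Every input is an assumption chosen purely for illustration: a ~2,500 kcal/day diet sustained over 20 years, and a hypothetical placeholder for frontier-model training energy, since OpenAI has not published official figures.

```python
# Back-of-envelope sketch of the "energy to train a human" framing.
# All inputs are assumptions for illustration; none are official figures.

KCAL_TO_JOULES = 4184        # 1 kilocalorie in joules
JOULES_PER_KWH = 3.6e6       # 1 kilowatt-hour in joules

# Assumption: an average diet of ~2,500 kcal/day sustained for 20 years.
daily_kcal = 2_500
years = 20
human_food_joules = daily_kcal * KCAL_TO_JOULES * 365 * years
human_food_mwh = human_food_joules / JOULES_PER_KWH / 1_000

# Assumption: a hypothetical placeholder for one large training run;
# actual training-energy numbers are not publicly confirmed.
assumed_training_mwh = 50_000  # i.e. 50 GWh, illustrative only

print(f"Food energy over 20 years: ~{human_food_mwh:,.0f} MWh")
print(f"Assumed model training energy: ~{assumed_training_mwh:,} MWh")
print(f"Ratio: ~{assumed_training_mwh / human_food_mwh:,.0f} 'human lifetimes of food'")
```

Under these assumed numbers, 20 years of eating works out to roughly 21 MWh, so a single hypothetical 50 GWh training run would equal a few thousand ‘human lifetimes of food’. That is why Altman’s framing only starts to flip once evolutionary history is loaded onto the human side of the ledger.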
OpenAI tech also evolved - from the minds and technological feats of humans
We see a few leaps in Altman’s expanded-timeline human vs AI efficiency comparison that need to be addressed. For example, shouldn’t the AI computing world also roll in the prior ‘energy cost’ of human evolution, the Renaissance, and so on? Aliens didn’t provide the blueprints for ENIAC.
Some commentators have also argued that Altman’s framing is dehumanizing, reducing childhood, learning, and growth to their energy inputs. Others even wonder whether Altman would prefer to see resources diverted from human to machine intelligence.
However, to give the clip more context and be fairer to the OpenAI boss, elsewhere in the Q&A he also took time to push for more sustainable energy solutions. Tapping further into renewables would take massive consumers like OpenAI a little more out of the firing line as competitors for scarce energy resources, at a time when folks’ utility bills inch higher and higher.
The Q&A took place in the wake of Altman and other AI high rollers meeting with PM Narendra Modi, during a highly publicized week that underscored India’s importance as an AI growth engine.

Comments

Notton: By that argument, I could argue that OpenAI's energy consumption has caused humans to be starved and devolve into talking like an AI chatbot. And it took less than 10 years.

coolitic: That has to be the most ridiculous set of mental gymnastics that I've seen in a while.

kva13: Since AI is so efficient next logical conclusion will be to slaughter all humans and leave super efficient AI. Seen it somewhere....