The energy required to train AI models has increased substantially as these models have grown more sophisticated and improved in reasoning, problem-solving and accuracy, according to the Stanford Emerging Technology Review 2026.
"Given hundreds of millions of queries per day, the operating energy requirement of ChatGPT might be a few hundred thousand kilowatt-hours per day, at a cost of several tens of thousands of dollars," the report said.
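The report's daily figures can be sanity-checked with rough, assumed inputs. The query volume ("hundreds of millions"), the roughly 2 Wh per-query estimate cited later in the report, and the electricity price used below are illustrative assumptions, not figures from the report:

```python
# Rough sanity check of the report's daily operating estimate (assumed inputs).
queries_per_day = 300_000_000  # "hundreds of millions" -- assumed midpoint
wh_per_query = 2               # report's per-query estimate, in watt-hours
price_per_kwh = 0.08           # assumed electricity price in USD per kWh

kwh_per_day = queries_per_day * wh_per_query / 1000
daily_cost_usd = kwh_per_day * price_per_kwh

print(kwh_per_day)     # 600,000 kWh/day -- "a few hundred thousand"
print(daily_cost_usd)  # ~$48,000/day -- "several tens of thousands of dollars"
```

Both results land in the ranges the report describes, though the exact numbers depend entirely on the assumed query volume and electricity price.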
Large language models like ChatGPT consume substantial amounts of power not just when answering queries but also during their training phase. According to the report, training its foundational model, GPT-4, required 50 million kilowatt-hours of electricity. To put this in perspective, the average US household consumes about 11,000 kWh a year, meaning training the model consumed enough energy to power roughly 4,500 homes for a year, which is expensive and raises questions about its environmental impact.
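The homes comparison follows directly from the two figures the report cites:

```python
# The report's training-energy comparison, using its own figures.
training_kwh = 50_000_000   # GPT-4 training energy, per the report
home_kwh_per_year = 11_000  # average annual US household consumption

homes_powered_for_a_year = training_kwh / home_kwh_per_year
print(round(homes_powered_for_a_year))  # ~4,545, i.e. roughly 4,500 homes
```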
"Paying for this energy adds significant cost and raises environmental concerns, even before a single person actually uses a model," the report said.
This energy cost does not end at the training phase: once a model is ready for use, the energy required to answer each query adds up quickly. According to the report, ChatGPT uses about 0.002 kilowatt-hours, or 2 watt-hours (Wh), of energy per query, roughly the amount of energy stored in a single alkaline AAA battery. For comparison, a single Google search uses about 0.3 Wh, meaning a ChatGPT query consumes about 567% more energy.
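At roughly 2 Wh per ChatGPT query versus about 0.3 Wh per Google search, the per-query comparison works out as follows (the 0.3 Wh figure is the commonly cited estimate the report invokes):

```python
# Per-query energy comparison: ChatGPT vs. a Google search.
chatgpt_wh = 2.0  # report's per-query estimate
google_wh = 0.3   # commonly cited per-search figure

ratio = chatgpt_wh / google_wh
extra_pct = (chatgpt_wh - google_wh) / google_wh * 100

print(round(ratio, 2))   # ~6.67x the energy of a Google search
print(round(extra_pct))  # ~567% more energy per query
```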
With rapid advances in AI and the rise of 'reasoning models', energy costs have climbed further still over the past year.
"With the recent focus on what are called reasoning models—foundation models that seemingly “think” through problems step by step before presenting the user with an output—such inference costs have substantially increased in the past year," the report said.
The computing hardware requirements are similarly steep. The report stated that training GPT-4 took close to 25,000 Nvidia A100 deep-learning GPUs, at a cost of $10,000 each, running for about 100 days.
"Including both these chips and other hardware components used, the overall hardware costs for GPT-4 were at least a few hundred million dollars. And the chips underlying this hardware are specialty ones often fabricated offshore," the report said.
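The chip figures alone account for most of that estimate:

```python
# GPU cost for GPT-4 training, from the report's own figures.
num_gpus = 25_000     # Nvidia A100 chips
cost_per_gpu = 10_000  # USD each

chip_cost_usd = num_gpus * cost_per_gpu
print(f"${chip_cost_usd:,}")  # $250,000,000 in GPUs alone
```

At $250 million for the chips by themselves, the report's "at least a few hundred million dollars" for total hardware, including the other components, follows readily.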