Gridlocked: The Coming Power Struggle Between AI And Manufacturing

In recent weeks, I’ve written extensively about data centres and the growing infrastructure behind AI. But there’s a part of this story that deserves urgent attention: the mounting pressure on global power grids and the looming conflict between energy for industrial manufacturing and energy for digital computing.
This is not a future risk. It is a present crisis in slow motion. At Greyhound Research, our latest work explores how hyperscale data centres and AI workloads are colliding head-on with industrial energy demand.
And the numbers are telling. According to the International Energy Agency, global data centres consumed an estimated 415 terawatt-hours of electricity in 2024, roughly 1.5% of total global consumption.
AI workloads alone drove half of all new power demand in this segment. At the same time, countries are trying to re-shore manufacturing, build semiconductor fabs, and electrify entire industrial sectors. These priorities are converging on the same limited infrastructure.
Unlike the past, where energy demands grew predictably, today’s spike in AI compute and industrial load is overwhelming grid capacity in regions from Northern Virginia and Dublin to parts of Asia.
It takes one to three years to build a data centre or factory. It takes five to fifteen years to build the power infrastructure that can support them. That mismatch is where the trouble begins. But this isn’t just about capacity. It’s about character.
Manufacturing and AI compute draw power in fundamentally different ways. Industrial loads are steady, location-bound, and politically strategic. Governments prioritise them because they create jobs, exports, and national capability.
AI workloads, in contrast, are bursty, fast-scaling, and digitally nomadic, but they are increasingly vital for economic competitiveness and digital sovereignty. So here’s the real issue: in a zero-sum moment, who gets the megawatts?
What we’re seeing globally is a quiet but rapid shift in how governments and utilities are answering that question. In Ireland, new data centres are now required to bring their own power. In parts of the US, industrial users are getting grid priority over AI clusters.
In China, entire "compute zones" are being shifted to less power-stressed provinces. In India, while no formal prioritisation yet exists, the first signs of competition are already visible.
The truth is, this is not a battle between good and bad demand. It’s a battle between predictable and unpredictable demand. Industrial energy loads are politically protected because they’re perceived as essential.
AI energy demand, no matter how strategic, is still seen as discretionary. That perception may not survive the next five years, but it’s shaping infrastructure decisions today.
Yet, this binary misses something big. AI is not only a consumer of energy. In industrial manufacturing, it is also becoming a driver of energy efficiency. A recent study in Energy Economics shows that Chinese manufacturing firms that adopted AI saw their energy intensity drop by over 13%, with even sharper reductions in private enterprises and those outside smart city zones.
In short, AI can help manufacturing firms produce more using less power. But this impact is only realised when AI is integrated into factory operations, not just backend server farms. The fact that AI drives energy demand while also enabling efficiency gains requires more thoughtful, layered responses from policymakers.
If AI helps reduce the energy intensity of manufacturing, it must be accounted for not just in kilowatts consumed but in kilowatts saved. When embedded into industrial processes, AI can optimise production scheduling, reduce resource waste, and even predict and prevent equipment failures.
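The accounting argument above can be made concrete with a toy calculation. The sketch below is purely illustrative: the factory figures, the 13% intensity drop (borrowed from the study cited earlier), and the AI system's own power overhead are all assumed numbers, and `FactoryProfile`, `energy_intensity`, and `net_grid_effect` are hypothetical names, not any standard methodology.

```python
from dataclasses import dataclass

@dataclass
class FactoryProfile:
    """Annual figures for a single plant (all numbers illustrative)."""
    output_units: float   # units produced per year
    energy_mwh: float     # electricity consumed per year, in MWh

def energy_intensity(p: FactoryProfile) -> float:
    """Energy intensity: MWh consumed per unit of output."""
    return p.energy_mwh / p.output_units

def net_grid_effect(before: FactoryProfile, after: FactoryProfile,
                    ai_overhead_mwh: float) -> float:
    """MWh saved per year, net of the AI system's own electricity draw.

    A positive result means the deployment saves more energy in the
    factory than the AI compute itself consumes.
    """
    saved = (energy_intensity(before) - energy_intensity(after)) * after.output_units
    return saved - ai_overhead_mwh

# Illustrative scenario: constant output, a 13% intensity reduction,
# and a modest compute overhead for the AI system itself.
before = FactoryProfile(output_units=100_000, energy_mwh=50_000)
after = FactoryProfile(output_units=100_000, energy_mwh=50_000 * 0.87)
print(round(net_grid_effect(before, after, ai_overhead_mwh=1_500)))  # net MWh saved
```

Even with a generous allowance for the AI system's own consumption, the net effect in this toy scenario is strongly positive, which is exactly why kilowatts saved belong in the same ledger as kilowatts consumed.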
These are not soft benefits. They are measurable gains with implications for national energy security and climate commitments. So, where does this leave us? For starters, we need to stop treating data centres like just another commercial facility.
If AI is going to be a national capability, its infrastructure needs to be treated as such. This means building policy frameworks that don’t just measure usage but also model flexibility, resilience, and strategic value.
We also need to stop thinking of this conflict as inevitable. Flexibility is a feature of AI infrastructure if we choose to design for it. Workload orchestration, real-time shifting, time-of-use responsiveness, and geographic flexibility are all viable tools. Google has already shown that AI training workloads can be paused or redirected to support grid stability.
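One way to picture that flexibility is a greedy scheduler that places deferrable AI training jobs into the cheapest hours that still have grid headroom. This is a minimal sketch under assumed inputs, not a description of how Google or any utility actually orchestrates workloads; `schedule_deferrable_jobs` and all its parameters are hypothetical.

```python
def schedule_deferrable_jobs(jobs, prices, capacity_per_hour):
    """Greedy sketch of time-of-use workload shifting.

    jobs: list of deferrable job demands, in MWh
    prices: price per MWh for each hour of the day
    capacity_per_hour: MWh of grid headroom available in each hour
    Returns a {hour: [job_mwh, ...]} placement (all inputs illustrative).
    """
    headroom = {h: capacity_per_hour for h in range(len(prices))}
    cheap_first = sorted(range(len(prices)), key=lambda h: prices[h])
    placement = {}
    for job in sorted(jobs, reverse=True):  # largest jobs placed first
        for h in cheap_first:
            if headroom[h] >= job:
                headroom[h] -= job
                placement.setdefault(h, []).append(job)
                break
        else:
            raise ValueError("no hour has enough headroom for this job")
    return placement

# Illustrative day: cheap overnight power, expensive daytime peak.
prices = [10] * 6 + [40] * 12 + [20] * 6
print(schedule_deferrable_jobs([5, 3, 2], prices, capacity_per_hour=6))
```

In this toy run all three jobs land in the cheap overnight window, the time-of-use responsiveness the article describes. A real orchestrator would also weigh deadlines, data locality, and carbon intensity, but the core idea, that AI compute can move when and where the grid is least stressed, is this simple.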
It also means redefining what we ask of hyperscalers. The old playbook - procure renewable energy certificates and build wherever land is cheap - is no longer sufficient. AI leaders need to co-invest in energy infrastructure, partner with utilities on demand-response, and design systems that give back flexibility to the grid, not just take from it.
This also opens up a new opportunity for countries willing to act fast. Governments that can offer AI infrastructure a mix of clean power, predictable permitting, and demand-aware energy policies will become the next digital magnets. Just as semiconductor fabs seek water security and skilled labour, AI clusters will increasingly seek out energy intelligence.
We cannot overlook talent. The energy-aware AI developer is going to be just as critical as the system architect or the data scientist. As AI becomes more integrated into the grid itself, powering everything from fault detection to predictive load balancing, the workforce that builds and governs this interaction must evolve.
Universities, technical institutions, and internal capability programmes must now bring energy fluency into the AI curriculum, just as they once brought cloud fluency into traditional IT. The energy allocation fight is coming. But it doesn’t have to be a zero-sum war.
If national planners, utilities, and hyperscalers get ahead of this now, we can build a model where AI innovation and industrial resurgence don’t cannibalise each other; they reinforce each other.
In a world where infrastructure is destiny, the grid has become the new geopolitical fault line. How we plan around it will define who leads, not just in tech or industry, but in everything that follows.
Disclaimer: The views expressed in this article are solely those of the author and do not necessarily reflect the opinion of NDTV Profit or its affiliates. Readers are advised to conduct their own research or consult a qualified professional before making any investment or business decisions. NDTV Profit does not guarantee the accuracy, completeness, or reliability of the information presented in this article.
