For much of the 20th century, artificial intelligence (AI) struggled not because researchers lacked ambition, but because the hardware available to power it simply wasn’t powerful enough. Early AI systems hit hard limits on processing speed and memory, contributing to repeated “AI winters” as progress stalled and funding dried up.

That problem is mostly gone now. Today, AI models are trained on specialized chips in huge data centers, and they can scale up in weeks instead of years. Compute, which used to be the main bottleneck, is now something that can be bought with enough money. Companies like Nvidia and AMD are also mass-producing ever more powerful graphics processing units (GPUs) — components conventionally used for gaming or visualization but also well suited to AI calculations — year after year.

So, beyond the fundamental architectures at the heart of these models, what’s keeping AI from becoming even more advanced? The new limit is far more physical in nature — and far harder to work around. It’s electricity.


Why AI’s energy appetite is exploding

Modern AI models don’t just train once and then stop. They run all the time, powering things like chatbots, search tools, image generators and more autonomous agents. This change has made AI a constant, large-scale user of electricity.

According to Sampsa Samila, academic director of the AI and the Future of Management Initiative at Barcelona’s IESE Business School, the problem isn’t a lack of energy in absolute terms. “It’s not the overall supply of energy, but having reliable, firm capacity at the right place and the right time that is in short supply,” he told Live Science.

Predictions for AI energy consumption show this strain clearly. The International Energy Agency (IEA) expects data centers to consume more than twice as much electricity by the end of the decade, reaching levels similar to those of major industrial economies. In some parts of the U.S., data centers already use as much power as heavy industry.

How AI is actually used matters just as much as how it’s trained. Training large language models (LLMs) still consumes a lot of power, but it tends to occur in large, infrequent runs. What’s growing faster is the everyday work — models responding to users, over and over again. Samila notes that newer “reasoning” systems, which spend more time working out an answer, push energy use into normal operations rather than occasional training bursts.

A grid built for a slower world

Power grids were designed for gradual growth, not for city-sized loads appearing almost overnight.

Juan Arismendi-Zambrano, an assistant professor at Ireland’s University College Dublin (UCD) Michael Smurfit Graduate Business School, said the main issue is timing. Large AI campuses grow faster than grid upgrades or government approvals can keep up with. This creates a real bottleneck: getting enough power, when and where it’s needed.

Current power grids were not built with AI in mind. (Image credit: Europa Press News via Getty Images)

“The ‘short supply’ of AI electricity is, in my view, less about an absolute global lack of electricity and more about local bottlenecks created by fast deployment of large data centres,” Arismendi-Zambrano told Live Science.

“These campuses scale quicker than electricity grid upgrades, or bureaucracy can respond. Especially when they land in rural areas chosen for cheap land and political ‘lobbying’ for states, but not engineered for sudden, concentrated load. The result is a very physical constraint: access to a lot of electricity power, on time, at the right node,” he said.

Clustering data centers in one area makes the problem worse. Jens Förderer, a professor at the University of Mannheim Business School in Germany, pointed to Northern Virginia’s “Data Center Alley,” where many facilities draw huge amounts of power from the same grid. Power plants, transmission lines and substations take years to build, but AI companies often start using compute much sooner, sometimes even before their buildings are finished.

“When many city-scale loads draw from the same local grid, scaling electricity provision becomes far harder,” Förderer said.

How the industry is scrambling to respond

There is no single fix for AI’s energy problem. Instead, companies are pursuing several strategies at once.

One is building power closer to the data centers themselves. Large tech firms have signed long-term contracts to support new power generation, including nuclear plants, and are exploring on-site power where grid upgrades move too slowly.

Google, for example, has been doing this in Texas through its acquisition of energy developer Intersect, which builds large-scale solar and storage projects alongside data center demand rather than waiting for grid upgrades. Microsoft, meanwhile, has signed a long-term deal with Constellation Energy tied to the planned restart of a nuclear reactor at Pennsylvania’s Three Mile Island site to supply power for its data centers.

Another is choosing locations based on electricity, rather than users. As Förderer noted, data centers are increasingly sited where power is easiest to scale, even if that means moving further from major population centers.

Then there is reuse — including a surprising source. Former cryptocurrency mining facilities are emerging as candidates for AI workloads. Once criticized for their energy use, these sites already have what AI needs most: large grid connections, cooling systems and experience running power-hungry hardware around the clock. The crossover between Bitcoin and AI may look strange, but the underlying physics is the same.

“These facilities already have large grid connections, and some former miners may pivot toward AI workloads,” Förderer said.

Canadian miner Bitfarms recently announced plans to transition its facilities away from Bitcoin mining toward high-performance computing and AI data centers, while Hut 8 — originally a Bitcoin mining company — struck a major $7 billion lease deal in late 2025 to provide data-center capacity for AI computing.

Some ideas look even further afield. Space-based data centers are sometimes pitched as a way to sidestep Earth’s grid entirely, using constant solar energy and the cold of space for cooling. Samila said the idea works on paper, but the numbers get intimidating fast.


A single 5-gigawatt facility would require around 2.5 by 2.5 miles (4 by 4 kilometers) of solar panels in orbit. It’s “in principle doable,” he added, but only with some serious engineering. Latency, upkeep and launch logistics remain open questions.
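That 4-by-4-kilometer figure can be sanity-checked with simple arithmetic. The sketch below assumes a solar constant of roughly 1,361 watts per square meter (the sunlight intensity above Earth's atmosphere) and a 25% panel conversion efficiency — both assumed values, not figures from the article:

```python
# Back-of-envelope check of the orbital solar array needed for a
# 5-gigawatt data center. The solar constant and panel efficiency
# below are assumptions for illustration, not sourced figures.
SOLAR_CONSTANT_W_PER_M2 = 1361   # sunlight intensity above the atmosphere
PANEL_EFFICIENCY = 0.25          # assumed conversion efficiency
FACILITY_POWER_W = 5e9           # 5 gigawatts

power_per_m2 = SOLAR_CONSTANT_W_PER_M2 * PANEL_EFFICIENCY
area_m2 = FACILITY_POWER_W / power_per_m2
side_km = (area_m2 ** 0.5) / 1000

print(f"Required array: {area_m2 / 1e6:.1f} km^2 (~{side_km:.1f} km per side)")
```

Under these assumptions the array comes out to roughly 15 square kilometers, a square just under 4 kilometers on a side — broadly consistent with the estimate above.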

Efficiency may be the fastest lever of all. Förderer pointed out that advances in chips, model design and system architecture have already reduced the energy required per unit of intelligence. Some recent efforts include an MIT breakthrough that aims to cut energy use by stacking chip components vertically, as well as a “rainbow-on-a-chip” that uses lasers to transmit data between components.

Such gains won’t eliminate the need for more power, but they can slow the rate at which demand grows.

Does unlocking energy unlock smarter AI?

The growing demand placed upon the electricity grid by AI also raises environmental concerns. Engineer Aoife Foley, professor and chair in Net Zero Infrastructure at the University of Manchester in the U.K., pointed out that the wider IT sector already makes up about 1.4% of global carbon emissions.

AI workloads use much more energy than regular cloud computing, and while big tech companies are investing in renewables and better cooling, Foley said these efforts alone are not enough. “These impacts can be reduced through smarter model optimisation and a closer alignment between data centre strategy and regional renewable generation,” she told Live Science.

Despite the scale of the challenge, none of the experts see electricity as a shortcut to artificial general intelligence (AGI) — a hypothetical form of AI that could match or exceed human intelligence. More energy makes it easier to build and run bigger systems, but it doesn’t solve the harder problems. Instead, Förderer argued that the real limits sit elsewhere — in access to data, in new model architectures and in genuine advances in reasoning.

“Energy is necessary but not sufficient,” Samila said in agreement, adding that today’s dominant approach to improving AI relies on massive amounts of power, but more electricity alone will not magically produce AGI.

More energy doesn’t guarantee smarter machines, but it does change who gets to participate. Access to power will shape where AI is built, who can afford to run it and how broadly it’s deployed. The bottleneck has shifted away from silicon and toward the physical world, where grids, permits, and power plants move at a very different pace than code.
