Monday Momentum
The New AI Bottleneck
Why AI's future runs on power, not processing, and what that means for US infrastructure
Happy Monday!
Here's what everyone missed while watching Nvidia's massive returns this year: the infrastructure stocks are winning.
Lumentum is up 372% in 2025. Nvidia? 97%. Seagate and Western Digital both tripled their value. Even Celestica, a company most people have never heard of, outperformed the chip king.
Why? Because we solved the GPU shortage and immediately hit a bigger wall: electricity.
Microsoft's CEO admitted last week his company "doesn't have enough electricity to install all the AI GPUs in its inventory." OpenAI had to stagger its GPT-4.5 launch in February because it "ran out of GPUs." Now? They can get all the chips they want. They just can't power them.
Grid connection queues in Northern Virginia stretch 4-7 years. Data center deals hit a record $61 billion in 2025, with a $40 billion acquisition led by BlackRock, Microsoft, and Nvidia buying Aligned Data Centers: not a chip company, not a model lab, but physical compute capacity with guaranteed power.
The AI arms race just changed. It's not about who trains the biggest model anymore. It's about who can keep the lights on.
Infrastructure stocks are outperforming AI chip leaders because power, not processing, is the new constraint:
- Data center power demand will jump 165% by 2030.
- Grid connections take 4-7 years in key markets.
- A single AI training run could require 8 GW by 2030 (equivalent to eight nuclear reactors).
- Companies are buying data centers for power capacity, not raw compute.
- The competitive moat in AI is increasingly about energy access, not model architecture.
From Chips to Kilowatts
For three years, the narrative was simple: GPUs are the bottleneck. Nvidia became the most valuable company on earth. Everyone fought for allocation. Hyperscalers rationed compute to their best customers.
That's now over.
The new constraint is power. And unlike chips, you can't just manufacture more capacity overnight.
The numbers tell the story. U.S. data centers consumed 183 terawatt-hours (TWh) in 2024, about 4% of total U.S. electricity. By 2030, that figure jumps to 426 TWh, a 133% increase. Data centers could consume 12% of all U.S. electricity within five years.
Globally, AI data centers need 10 gigawatts of new power capacity in 2025, more than the state of Utah's total capacity. If exponential growth continues, they'll need 68 GW by 2027, nearly equivalent to California's entire power grid.
Here's where it gets worse: a single large-scale AI training run could require 1 gigawatt by 2028. By 2030? That number becomes 8 gigawatts, equivalent to eight nuclear reactors, for one training job.
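The arithmetic behind these projections is worth making explicit. Here's a back-of-the-envelope sketch using the figures above; the ~1 GW-per-reactor conversion is my own assumption, based on the typical output of a large nuclear unit:

```python
# Sanity-check the power projections cited above.
# Consumption figures come from the article; ~1 GW per reactor is an
# assumed typical rating for a large nuclear unit.

us_dc_2024_twh = 183   # U.S. data center consumption, 2024 (TWh)
us_dc_2030_twh = 426   # projected U.S. data center consumption, 2030 (TWh)

growth_pct = (us_dc_2030_twh / us_dc_2024_twh - 1) * 100
print(f"U.S. data center demand growth, 2024-2030: {growth_pct:.0f}%")  # ~133%

training_run_2030_gw = 8   # projected draw of one large training run, 2030 (GW)
gw_per_reactor = 1.0       # assumption: ~1 GW per large nuclear reactor

reactors_needed = training_run_2030_gw / gw_per_reactor
print(f"Reactor-equivalents for one 2030 training run: {reactors_needed:.0f}")
```

The point of the exercise: a 133% jump in six years is not something transmission build-out, which runs on decade-long permitting cycles, can absorb by default.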
And the infrastructure to support this doesn't exist yet.
Grid upgrade costs through 2030 are estimated at $720 billion. Transmission projects take years to permit and years more to build. In hot markets like Virginia, the queue for grid connections is 4-7 years long. You can't dial down power draw without sacrificing performance, and you can't simply build elsewhere because suitable sites with power, cooling, and network access are finite.
Microsoft is on track to spend $80 billion on AI data centers in fiscal 2025. But CEO Satya Nadella's team has a problem: they have more GPUs than they can plug in. The company even signed a deal to restart Three Mile Island's nuclear plant to secure 24/7 carbon-free electricity for its AI infrastructure.
The Money Knows
While everyone was watching model releases and benchmark wars, AI infrastructure stocks quietly went parabolic.
Lumentum, a company that makes optical connections for AI servers, is up 372% in 2025. Sales jumped 58% in its most recent quarter. The company's CEO said 60% of revenue now comes from cloud and AI infrastructure. Every GPU in a rack needs to be connected to every other GPU. Future systems will scale out across racks, data centers, and entire regions. That requires fiber-optic connections at scale.
Seagate is up 231%. Western Digital tripled. The story is the same: AI needs massive amounts of storage. A single hospital using AI to analyze medical imaging processes 7 billion images. Hard drives provide the most cost-effective storage solution, and data centers need larger, more expensive drives for AI workloads.
Even Micron, a memory chip maker, shut down its entire consumer memory business in December to redirect supply toward AI. The company is "more than sold out" of memory chips and reported the best revenue upside in the history of the U.S. semiconductor industry, excluding Nvidia.
The pattern is clear: AI's infrastructure layer is where the money is flowing. Not just chips, but everything that makes the chips work.
And increasingly, that means power.
Data center deals hit $61 billion through November 2025, breaking last year's record. The largest single deal: a $40 billion acquisition of Aligned Data Centers by a consortium including BlackRock, Microsoft, and Nvidia. They didn't buy a chip company or an AI lab. They bought physical capacity with secured power.
Alphabet just paid $4.75 billion for Intersect, an AI data center company. Microsoft signed a 200-megawatt deal with IREN in November. OpenAI committed over $1 trillion in infrastructure spending from 2025-2035 across Oracle, Microsoft, AWS, Nvidia, AMD, and CoreWeave.
The Geographic Realignment
For decades, tech concentrated in talent hubs: Silicon Valley, Seattle, Austin, New York. Access to engineers dictated location.
That's now changing.
Pennsylvania just launched a state program to become a top AI data center hub by solving the electricity bottleneck first. Instead of subsidizing facilities directly, the state is prioritizing deliverable power through grid upgrades and workforce development. As the second-largest U.S. producer of natural gas, with Appalachian Basin shale and legacy hydro assets being repowered, Pennsylvania's energy advantage is the selling point.
Analysts say this could redirect growth from Northern Virginia, traditionally the largest data center market in the U.S. Why? Because Northern Virginia's grid is maxed out. New builds face multi-year waits. Pennsylvania is offering guaranteed power now.
This is unprecedented. States are competing on energy infrastructure, not talent pools or tax incentives.
Companies are responding by going off-grid. Behind-the-meter strategies and "Bring Your Own Power" (BYOP) approaches are accelerating. Instead of waiting years for utility connections, operators are deploying on-site or near-site generation.
Right now, that means natural gas. Long-term, it means exploring small modular nuclear reactors (SMRs) for reliability. Pennsylvania's strategy explicitly includes nuclear.
The implication: AI infrastructure is clustering around power hubs, not coastal tech centers. If you're building AI infrastructure and your site selection strategy is still "near where the engineers live," you're already behind.
Grid capacity is the constraint. Companies with secured power have a competitive advantage. Everyone else is stuck in queue.
The Bottom Line
Three years ago, the constraint was GPUs. We manufactured our way out.
Two years ago, it was model architecture. We scaled our way out.
Now it's power. And you can't manufacture electricity overnight or compress permitting timelines.
The companies that win the next phase of AI won't necessarily have the best models or the most compute. They'll have guaranteed access to gigawatts of power when competitors are stuck in grid connection queues.
Infrastructure stocks are outperforming AI stocks because the market is starting to price this in. Energy is the new moat. Geography is realigning around it. Competitive advantage is shifting from "who can train the biggest model" to "who can keep it running."
If your AI strategy doesn't account for where the electricity comes from, you don't have an AI strategy. You have a wishlist.
In motion,
Justin Wright
If the constraint on AI progress has shifted from compute power to electrical power, and if securing that power now requires multi-year commitments and geographic realignment, are we entering an era where AI leadership is determined less by algorithmic innovation and more by infrastructure control?

Sources

AI to drive 165% increase in data center power demand by 2030 - Goldman Sachs Research
How Data Centers Redefined Energy and Power in 2025 - Data Center Knowledge
AI infrastructure stocks Lumentum, Celestica, Seagate beat Nvidia 2025 - CNBC
Memory loss: As AI gobbles up chips, prices for devices may rise - NPR
AI's Power Requirements Under Exponential Growth - RAND Corporation
Can US infrastructure keep up with the AI economy? - Deloitte
Global data centre deals hit record $61 billion in 2025 - Tech News Hub
Microsoft to spend $80bn on AI data centers in 2025 - Data Center Dynamics
Energy demand from AI - International Energy Agency

I am excited to officially announce the launch of my podcast Mostly Humans: An AI and business podcast for everyone!
Episodes can be found below - please like, subscribe, and comment!