This Week in Voltage
The most important energy policy move of the month didn't come from the White House or a utility earnings call. It came from a four-paragraph order issued April 16 by the Federal Energy Regulatory Commission — and if you're paying attention to where civilization's electricity future gets decided, this is the document to read.
FERC announced it will act by end of June 2026 on a sweeping rulemaking to rewrite how large power users — defined as loads above 20 MW — connect to the U.S. transmission grid. The docket, RM26-4-000, was triggered by a DOE draft notice from October 2025 that named AI data centers explicitly as the forcing function behind "unprecedented current and expected growth of large loads." FERC reviewed more than 3,500 pages of public comments before issuing Thursday's order. That's not bureaucratic theater — that's a commission that knows the stakes.
The June deadline is the milestone to watch. Whatever FERC produces will set the rules for how the next generation of compute infrastructure plugs into American power. Get it wrong, and you've built a legal and logistical wall between AI ambition and electrical reality. Get it right, and you've opened the floodgates for the most consequential infrastructure buildout since the interstate highway system.
Deep Charge: Data Centers Aren't Tech Infrastructure. They're Power Infrastructure.
There's a framing problem at the center of every policy debate about AI data centers, and it's costing us time we don't have.
Politicians, regulators, and even most investors still categorize data centers as technology infrastructure — a cousin of server farms and fiber optic cables, something the tech industry builds and operates and the rest of us consume. That framing is wrong, and the FERC rulemaking exposes exactly why.
Data centers are electricity infrastructure. They are, functionally, large industrial loads that happen to produce computation instead of aluminum or steel. The policy questions they raise — grid interconnection, transmission capacity, cost allocation, reliability obligations — are the same questions we ask about smelters and chemical plants. The difference is scale and speed: no industrial category in American history has sought to add this much load this fast.
The DOE's October 2025 draft notice put it plainly: large loads like AI data centers must be able to connect to the transmission system "in a timely, orderly, and non-discriminatory manner" to support affordable, reliable, and secure electricity. That's the language of grid planning, not tech policy. And it signals that at least some corners of the federal government have made the conceptual leap.
The Brookings analysis of AI's global energy demands — updated April 2, 2026 — frames the governance challenge clearly: rapid AI-driven data center growth is driving increasing electricity and water consumption worldwide, and international measurement, standards, and reporting tools are still catching up. The infrastructure is moving faster than the frameworks designed to manage it. That's not a crisis — it's an opportunity, if regulators move at the speed the moment demands.
FERC's June deadline suggests it understands this. The commission's December 2025 order directing PJM — the nation's largest grid operator — to establish transparent rules for large load co-location was an early test. June's rulemaking is the main event.
I'd argue the deeper issue is that we've been asking the wrong question. The debate has centered on whether data centers impose unfair costs on ratepayers, or whether they jump the interconnection queue ahead of other projects. Those are real concerns. But the civilizational question is different: what is the cost of not connecting them? Every month of delay in grid interconnection is a month of compute capacity that doesn't exist, AI capability that doesn't deploy, and economic output that doesn't materialize. The bottleneck isn't ambition. It's wire and regulatory process.
By the Numbers
- 20 MW — the threshold above which a load is classified as "large" under the FERC docket, the category that now includes most serious AI data center deployments
- 3,500+ pages — public comments reviewed by FERC before issuing its April 16 order, per FERC's own filing
- End of June 2026 — the hard deadline FERC has set for action on large-load interconnection rules
- $83 / $78 per barrel — Goldman Sachs's 2026 Brent/WTI forecast, per Reuters, with flagged two-way risks from Middle East uncertainty — a reminder that oil volatility keeps accelerating the economic case for electrification
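For a sense of what that 20 MW threshold means, a quick back-of-envelope conversion to annual energy is useful. This is a minimal sketch, assuming round-the-clock operation (a capacity factor of 1.0) and an average U.S. household consumption of roughly 10,500 kWh per year — both figures are my assumptions for illustration, not numbers from the FERC docket:

```python
# Back-of-envelope: annual energy implied by FERC's 20 MW "large load" threshold.
# Assumes continuous operation and a rough U.S. average household consumption
# of ~10,500 kWh/year (illustrative assumption, not from the filings).

THRESHOLD_MW = 20
HOURS_PER_YEAR = 8760
AVG_HOME_KWH_PER_YEAR = 10_500  # assumed rough U.S. average

annual_mwh = THRESHOLD_MW * HOURS_PER_YEAR           # MWh per year at full load
home_equivalents = annual_mwh * 1_000 / AVG_HOME_KWH_PER_YEAR

print(f"{annual_mwh:,} MWh/year, roughly {home_equivalents:,.0f} average U.S. homes")
```

Under those assumptions, the floor of the "large load" category already equates to the annual electricity use of a small city's worth of households — and serious AI deployments sit well above the floor.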
What We're Fighting For
The FERC rulemaking in June isn't a regulatory footnote. It's a load-bearing wall in the structure of the next economy.
Every civilization-scale technology — railroads, electrification, the internet — required a moment when regulators stopped treating the new thing as a novelty and started treating it as infrastructure. We are at that moment for AI compute. The question is whether the rules we write between now and the end of June are built for the world we're entering or the world we're leaving.
Watch June. The wire is ready. The question is whether the rulebook catches up.
