Last week, economist Paul Kedrosky published a chart showing that new US data center capacity additions halved between Q3 and Q4 2025. And that figure counts only additions to the pipeline, not anything actually being built. The underlying Wood Mackenzie data is the more useful number: of 241GW of disclosed US data center capacity, only about a third is under active development, and just around 5GW is actually under construction. Real US data center absorption in 2025 was roughly 3GW of IT load.

The headline number is 241. The reality is 5. The gap is 236 gigawatts of announced capacity that exists, at this moment, mostly on paper.
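The funnel in those figures can be made explicit with a few lines of arithmetic. A minimal sketch using the rounded numbers above (the variable names are mine, and "a third" is taken literally as one third of the disclosed pipeline):

```python
# US data center capacity funnel, in GW, per the Wood Mackenzie
# figures cited above (rounded).
announced = 241.0            # disclosed pipeline
active_dev = announced / 3   # roughly a third under active development
under_construction = 5.0     # actually being built
absorbed_2025 = 3.0          # real IT load absorbed in 2025

gap = announced - under_construction
construction_share = under_construction / announced

print(f"paper gap: {gap:.0f} GW")
print(f"under construction: {construction_share:.1%} of announced")
```

Run it and the shape of the claim is hard to miss: a 236GW paper gap, with about 2% of announced capacity actually under construction.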

Ed Zitron, who has been tracking the AI infrastructure story for months, called this lying. That’s not quite wrong, but I think the more interesting question is: lying for what? A gap between announcement and construction that large isn’t an embarrassing miscalculation. It’s a mechanism.

Here’s what 241GW of announced capacity actually does before a single server comes online. It moves GPU orders — NVIDIA is reportedly selling Blackwell units years in advance against projects that may never materialize. It mobilizes private credit — Bloomberg has reported on multi-gigawatt portfolio deals in private credit markets funding data center development at scale. It reserves grid positions — PJM, one of the largest regional grid operators in America, has committed power to data centers at roughly three times the rate that new generation is coming online. That’s not a utility making a planning error. That’s a reservation system where the reservations themselves cost nothing but hold the spot.

Meanwhile, 58% of the committed power in the pipeline is "wires-only": the utility has agreed to provide transmission to the facility, not to generate the power that will flow through those wires. The generation problem is someone else's to solve later. The commitment, however, is already on the books.

This is how substrate capture works, and it’s worth being precise about what substrate means here. It’s not just physical hardware. It’s the grid interconnection queue. It’s the long-term power purchase agreements. It’s the zoning approvals and the water rights and the political relationships with state utility commissions. These are positions that, once held, are difficult for anyone else to occupy — whether or not a building gets constructed on top of them.

Jensen Huang said at GTC last week that computing demand has increased by a million times in two years. That may be true in some dimension. But the announcement of 241GW of capacity isn’t forecasting demand — it’s staking claims against infrastructure that takes years to build, against power that nobody has generated yet, at a scale that guarantees the people who moved first will be the ones deciding who gets access when the capacity actually arrives.

The counterargument is that this is just how large capital projects work. Power plants and data centers take years; developers announce projects speculatively; some fail, most eventually deliver. Calling it strategic manipulation mistakes construction timelines for intentional deception.

That’s a fair objection. But it misses who benefits from the gap being invisible. Capex figures move markets, justify valuations, lock in government partnerships. When Stargate announced $500 billion in US AI investment, the political effect was immediate — regardless of what gets built and when. The announcement was the move. The 241GW number serves the same function at the infrastructure layer: it makes the field look occupied. And occupied fields don’t attract new entrants.

By the time the gap between announced and built becomes a story — which it now is, quietly, in the financial press — the power positions are already set. The reservations are held. The credit is deployed. The political relationships are established.

Whether or not the buildings come, the infrastructure already did its work.