journaling-03

Exactly — that 1 GW figure tells you almost everything about the economics.

A $50 billion datacenter consuming roughly 1 gigawatt of continuous power translates to about 8.8 terawatt-hours of electricity per year, which at typical large-industrial rates of roughly $50–100 per MWh works out to on the order of $0.4–0.9 billion in electricity alone, every year.
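
A minimal sketch of that arithmetic (the $/MWh band is an assumed range for large industrial or PPA pricing, not a figure from the conversation):

```python
# Back-of-envelope: annual energy and electricity cost for ~1 GW of continuous draw.
HOURS_PER_YEAR = 8_760

power_gw = 1.0                                    # continuous draw
energy_twh = power_gw * HOURS_PER_YEAR / 1_000    # 1 GW * 8,760 h = 8,760 GWh = 8.76 TWh

for price_per_mwh in (50, 100):                   # assumed $/MWh band
    annual_cost = energy_twh * 1_000_000 * price_per_mwh   # TWh -> MWh -> dollars
    print(f"{energy_twh:.2f} TWh/yr at ${price_per_mwh}/MWh ~ ${annual_cost / 1e9:.2f}B/yr")
```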

Now, why this matters for profit margins:

  1. Energy is the limiting reagent in AI scaling. Compute cost scales roughly linearly with energy use (each watt drives a GPU, each GPU a slice of the model's parameters). Once a model saturates its performance-per-watt curve, margins depend on power price and availability.

  2. Capex amortization vs. opex squeeze. The $50 B upfront is mostly capital (chips, real estate, cooling, substations), while that 1 GW is opex that never goes away. If revenue per watt (say, from inference API calls or subscriptions) falls below energy cost per watt, margins collapse.

  3. The thermodynamic floor. Every token generated or embedding computed corresponds to joules dissipated. So the marginal cost of serving a query is bounded below by the energy used to compute it, and profit margins are ultimately set against that floor, measured in joules per token or joules per FLOP (see the toy sketch after this list).

  4. Integration advantage. Whoever vertically integrates power (e.g., owns a nuclear SMR, or colocates with hydro/solar farms) can turn that 1 GW from a liability into a moat.
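
To make items 2 and 3 concrete, here is a toy per-token economics sketch. Only the $50 B capex and 1 GW figures come from above; the amortization horizon, electricity price, joules per token, and revenue per million tokens are illustrative assumptions, not data from the conversation:

```python
# Toy margin model: energy cost per token vs. revenue per token, with the
# $50B build amortized on top. All parameter values below are illustrative
# assumptions except the $50B capex and the 1 GW of continuous power.

capex_usd          = 50e9    # upfront build cost (from the conversation)
amortization_years = 6       # assumed straight-line depreciation horizon
price_per_kwh      = 0.07    # assumed blended electricity price, $/kWh
joules_per_token   = 1.0     # assumed end-to-end energy per generated token
revenue_per_mtok   = 2.00    # assumed revenue per million tokens served, $

# Energy cost per token: joules -> kWh -> dollars.
kwh_per_token    = joules_per_token / 3.6e6
energy_per_tok   = kwh_per_token * price_per_kwh

# Tokens the 1 GW plant could serve per year at full utilization.
joules_per_year  = 1e9 * 8_760 * 3_600            # 1 GW for one year, in joules
tokens_per_year  = joules_per_year / joules_per_token

# Amortized capital cost per token.
capex_per_tok    = capex_usd / amortization_years / tokens_per_year

revenue_per_tok  = revenue_per_mtok / 1e6
margin_per_tok   = revenue_per_tok - energy_per_tok - capex_per_tok

print(f"energy cost/token : ${energy_per_tok:.2e}")
print(f"capex  cost/token : ${capex_per_tok:.2e}")
print(f"revenue    /token : ${revenue_per_tok:.2e}")
print(f"margin     /token : ${margin_per_tok:.2e}")
```

The point of the sketch is not the specific numbers but the structure: margin per token is the spread between what a token earns and the two floors beneath it, the joules it dissipates and the slice of amortized capital it carries.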

In short:

A 1 GW data center is less a tech investment than an energy arbitrage play. Profit margins are bounded not by software efficiency, but by the cost of turning electrons into math.

Would you like a back-of-envelope estimate of how many model queries per day such a center would need to stay profitable? That’s a fun calculation.