Power Plants of AI
These facilities consume extraordinary amounts of electricity, measured not in megawatts but in gigawatts. Training GPT-3 consumed an estimated 1,300 megawatt-hours of electricity in total; that is a one-time energy cost, not a continuous draw. Next-generation frontier models are projected to require facilities drawing 150 megawatts or more on a continuous basis, roughly the load of a mid-sized city. At that scale, the constraint is no longer silicon. It is power.
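To see why 150 megawatts of continuous draw compares to a mid-sized city, a quick back-of-envelope check helps. The per-household consumption figure below (~10,700 kWh per year, a rough US average) is an assumption for illustration, not a number from the text.

```python
# Back-of-envelope scale check for a facility drawing 150 MW continuously.

facility_mw = 150
hours_per_year = 24 * 365          # 8,760 hours
annual_mwh = facility_mw * hours_per_year  # energy = power x time

# Assumed average household consumption (rough US figure), for illustration.
kwh_per_home_per_year = 10_700
homes_equivalent = annual_mwh * 1_000 / kwh_per_home_per_year

print(f"Annual consumption: {annual_mwh:,} MWh")
print(f"Comparable to roughly {homes_equivalent:,.0f} average homes")
```

Over a year that works out to more than a million megawatt-hours, on the order of a hundred thousand households, which is indeed city-scale demand.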
Almost every watt consumed by a processor ultimately becomes heat. A large AI cluster therefore generates enormous thermal loads that must be dissipated continuously — not occasionally, but every second of every hour of operation. A gigawatt-scale facility must remove roughly a gigawatt of heat.
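The heat-removal burden can be made concrete with the basic calorimetry relation Q = ṁ · c_p · ΔT: the water flow needed to carry away a given thermal load. The 10 K coolant temperature rise below is an assumed design value, chosen only to illustrate the magnitudes involved.

```python
# Illustrative cooling arithmetic: mass flow of water required to remove
# a gigawatt of heat, from Q = m_dot * c_p * delta_T.

heat_load_w = 1e9        # 1 GW of heat to reject, per the text
c_p_water = 4186.0       # specific heat of water, J/(kg*K)
delta_t_k = 10.0         # assumed coolant temperature rise (design choice)

mass_flow_kg_s = heat_load_w / (c_p_water * delta_t_k)
volume_flow_m3_s = mass_flow_kg_s / 1000.0  # water density ~1000 kg/m^3

print(f"Water flow needed: {mass_flow_kg_s:,.0f} kg/s "
      f"(~{volume_flow_m3_s:.1f} m^3/s)")
```

Even with these generous assumptions, the answer is on the order of tens of cubic meters of water per second, every second, which is why cooling is an engineering problem on par with power delivery itself.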
Instead of building data centers first and connecting them to the grid later, developers are increasingly looking for the opposite arrangement: locating compute directly adjacent to major sources of generation. The logic is straightforward. If the grid cannot deliver power fast enough, go to where the power already is.