
We’re all going to be paying AI’s Godzilla-sized power bills


Opinion When I was a wet-behind-the-ears developer running my programs on an IBM 360, a mainframe that was slower than a Raspberry Pi Zero W, my machine used about 50 kilowatts (kW). I thought that was a lot of power. Little did I know what was coming.

Today, a large, AI-dedicated datacenter typically requires 100 megawatts (MW) – roughly the power drawn by 100,000 homes, assuming an average household draw of about one kilowatt. That’s a lot of power, but it’s not that much more than your typical hyperscaler datacenter. However, there are already a lot of AI datacenters – 746 at last count.

Think that’s a lot? That’s nothing compared to where we’re going.

It appears that AI-ready datacenters will grow at a compound annual growth rate (CAGR) of 33 percent between now and 2030. That’s a heck of a lot more datacenters, which, in turn, means a hell of a lot more power consumption.
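For the curious, here’s what a 33 percent CAGR compounds to – a minimal Python sketch, assuming a five-year run from 2025 to 2030 (the exact window is my assumption):

```python
# What a 33 percent CAGR means in practice: the cumulative growth
# multiplier over five compounding years (2025-2030 is an assumption).
cagr = 0.33
years = 2030 - 2025
multiplier = (1 + cagr) ** years
print(f"{multiplier:.1f}x")  # -> 4.2x
```

Call it roughly four times today’s AI datacenter capacity by 2030.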

Why? Well, AI sucks down so much power because training and operating modern models, especially generative AI, require extremely intensive computational resources and vast amounts of data processed in parallel across large clusters of high-performance GPUs and TPUs.

For example, the training phase of a state-of-the-art AI model requires repeated adjustment of billions to trillions of parameters. That process alone requires thousands of GPUs running simultaneously for weeks or months at a time. Adding insult to injury, each of those special AI chips draws far more juice than your run-of-the-mill CPU.

But once the training is done, it doesn’t take that much power, does it? It does. While the AI companies are remarkably reticent about how much energy is consumed when you ask ChatGPT to tell you a knock-knock joke, render a picture of David Tennant as Dr Who, or create a ten-second video of the characters from Star Trek: Lower Decks telling Dr Who a knock-knock joke, we know that answering even seemingly simple questions requires a lot of power.

Whether it’s learning or answering questions, these AI chips are hot as hell. Your run-of-the-mill AI chips run at 70°C to 85°C – that’s 158°F to 185°F for those of us on the left side of the pond. And you thought your GeForce RTX 5090 was hot stuff!
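If you want to double-check that conversion, it’s just the standard formula – a quick Python sketch:

```python
# Celsius-to-Fahrenheit check for the chip temperatures above.
for celsius in (70, 85):
    print(f"{celsius}°C = {celsius * 9 / 5 + 32:.0f}°F")
# -> 70°C = 158°F
# -> 85°C = 185°F
```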

In practice, that means up to 20 percent of an AI datacenter’s power consumption goes to just keeping the boards from melting down.

Put it all together, and today’s large, state-of-the-art AI datacenters are approaching and sometimes exceeding 500 MW, and next-gen sites in planning stages are targeting 2 gigawatts (GW). The nonprofit American Council for an Energy-Efficient Economy (ACEEE) estimates that those datacenters will consume “nearly 9 percent of total US grid demand by 2030.”

But that’s nothing compared to what’s coming down the road.

Take OpenAI. For OpenAI to fulfill its ambitious datacenter plans, it needs a minimum – minimum – of 16 gigawatts (GW) of sustained power. That’s enough to rival the entire electricity demand of countries like Switzerland or Portugal. The OpenAI Stargate project alone needs 10 GW of datacenter capacity across several phases in the United States by 2029. To quote Nvidia CEO Jensen Huang: “This is a giant project.” You think!?

But as grandiose as OpenAI’s plans are, the other would-be AI superpowers are also pushing forward with plans that are just as big. Amazon, for example, in partnership with Anthropic, is building Project Rainier. Its initial cluster of datacenters in Indiana will gobble down 2.2 GW.

Microsoft asserts its Fairwater cluster in Mount Pleasant, Wisconsin – a town that has already suffered through one tech boondoggle with Foxconn – will be the largest AI datacenter of its kind. Microsoft’s president, Brad Smith, piously claims the company will build a 250 MW solar farm to match every kilowatt-hour it draws from fossil fuels. The Clean Wisconsin group believes Fairwater will need more like 2 GW. I buy their numbers, not Microsoft’s.

I mean, Microsoft is also the company that’s planning on bringing the Three Mile Island nuclear reactors back online. Do you remember Three Mile Island? I do. No, thank you. Besides, even when fully operational, those reactors only had a generating capacity of 837 MW.

Just for giggles, I did a back-of-the-envelope calculation on how big a solar farm would need to be to generate a single terawatt (TW) of power. With the current state of solar power, the rule of thumb is that it takes five acres of solar panels to deliver one MW. So, a TW – a million MW – needs five million acres, or about 7,812 square miles. Yeah, that’s not going to scale, especially in a Wisconsin blizzard in December.
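Here’s that envelope math as a minimal Python sketch – the five-acres-per-MW rule of thumb and the 640-acres-per-square-mile conversion are the only inputs:

```python
# Back-of-the-envelope solar farm sizing.

ACRES_PER_MW = 5          # rule of thumb quoted above
ACRES_PER_SQ_MILE = 640   # standard conversion

def solar_farm_size(target_mw: float) -> tuple[float, float]:
    """Return (acres, square miles) of panels needed for a given capacity in MW."""
    acres = target_mw * ACRES_PER_MW
    return acres, acres / ACRES_PER_SQ_MILE

acres, sq_miles = solar_farm_size(1_000_000)  # 1 TW = 1,000,000 MW
print(f"{acres:,.0f} acres, or about {sq_miles:,.0f} square miles")
# -> 5,000,000 acres, or about 7,812 square miles
```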

Here’s the simple truth. The AI companies’ plans are fantasies. There is no way on Earth the electric companies can deliver anything like enough juice to power up these mega datacenters. Even Trump’s Department of Energy, a nuclear power cheerleader, admits it takes years to bring new nuclear power reactors online.

Coal? Hydropower? Gas? Please. As Deloitte gently puts it: “Few energy resources align with datacenter timelines.” If we can wait until 2040, then we might have enough power to support all these AI pipe dreams. Maybe.

The utilities will certainly do their best, so they’re pushing their building plans forward as fast as possible. There’s only one little problem with that. Recall the project manager’s mantra: “You can have something that’s good, cheap, or fast – pick two.” Guess what? They’ve picked “good and fast,” so someone has to foot the bill. Guess who?

Yes! It will be you and me. A Bloomberg News analysis of wholesale electricity prices shows “electricity now costs as much as 267 percent more for a single month than it did five years ago in areas located near significant datacenter activity.” Those bills are going to skyrocket in the next few years.

I see a race coming between the bursting of the AI bubble, the cracking of our already overburdened electrical grid, and all of us shivering in the winter and baking in the summertime, as AI-driven costs and brownouts make us miserable in our homes.

In a word: “Yuck!”

But, hey, if I were a betting man, I’d bet the AI companies will fail first. It’s not the win we may have wanted, but it’s the win we’ll get. ®
