Future Tech

Arm CEO warns AI's power appetite could devour 25% of US electricity by 2030

Tan KW
Publish date: Wed, 10 Apr 2024, 07:59 AM

Arm CEO Rene Haas cautions that if AI continues to get more powerful without corresponding gains in power efficiency, datacenters could end up consuming enormous amounts of electricity.

Haas estimates that while AI datacenters currently account for a modest four percent of US power consumption, the industry is on track to use 20 to 25 percent of the US power grid by 2030, per a report from the Wall Street Journal. He lays the blame specifically on popular large language models (LLMs) such as ChatGPT, which he described as "insatiable in terms of their thirst."

The Arm CEO isn't alone in making this prediction. The International Energy Agency's (IEA) Electricity 2024 report [PDF] expects worldwide power consumption by AI datacenters to reach ten times its 2022 level. Part of the problem is that LLMs like ChatGPT require far more power per request than traditional search engines like Google: the IEA estimates that one ChatGPT request consumes almost ten times as much power as a Google search.

If Google were to switch its search engine entirely to AI software and hardware, its power draw would increase roughly tenfold, according to the report, requiring an extra 10 terawatt-hours (TWh) of electricity per year. The Electricity 2024 report says government regulation will be necessary to keep the power consumption of datacenters (AI or otherwise) in check.
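
Those figures can be roughly reproduced with a back-of-envelope calculation. The sketch below is illustrative only: the per-query energy estimates (about 0.3 Wh for a conventional Google search, about 2.9 Wh for a ChatGPT-style request) and the assumed nine billion searches a day are commonly cited ballpark numbers, not figures given in this article.

# Back-of-envelope check of the IEA-style figures quoted above.
# The per-query energy values and daily search volume are assumptions
# (commonly cited estimates), not numbers taken from this article.

GOOGLE_SEARCH_WH = 0.3    # assumed energy per conventional Google search (Wh)
LLM_QUERY_WH = 2.9        # assumed energy per ChatGPT-style request (Wh)
SEARCHES_PER_DAY = 9e9    # assumed daily Google search volume

# Per-request multiplier of an LLM query over a plain search
ratio = LLM_QUERY_WH / GOOGLE_SEARCH_WH

# Extra electricity if every search became an LLM query, in TWh per year
extra_wh_per_day = (LLM_QUERY_WH - GOOGLE_SEARCH_WH) * SEARCHES_PER_DAY
extra_twh_per_year = extra_wh_per_day * 365 / 1e12   # Wh -> TWh

print(f"LLM query vs search: ~{ratio:.1f}x more energy per request")
print(f"Extra electricity for all-LLM search: ~{extra_twh_per_year:.1f} TWh/year")

Run as written, this works out to roughly a 10x per-request multiplier and about 8.5 TWh per year of extra demand, in the same ballpark as the report's figure of an additional 10 TWh, which is the point of the sanity check.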

Some countries, such as Ireland, may even see a third of their electricity consumed by datacenters in 2026. The crunch in Ireland appears to be starting already: Amazon Web Services servers there seem to be hindered by power limitations.

Increasing efficiency, as Haas suggests, is one possible solution to the crisis, since it's hard to imagine datacenters cutting power by compromising on performance. But even if AI hardware and LLMs become more efficient, electricity usage won't necessarily fall: the saved energy could simply be spent on expanding computing capacity, keeping total power draw the same.

Instead, increasing capacity seems to be the way forward for companies like Amazon, which recently acquired a nuclear-powered datacenter in Pennsylvania. While rapidly increasing power consumption on a global scale probably isn't a good thing and is bound to be very expensive, at least it could make power greener, maybe, hopefully. ®

 

https://www.theregister.com//2024/04/09/ai_datacenters_unsustainable/
