
Arm's Rene Haas Cops to AI's "Insatiable Energy Needs," But Says His Company Can Help

With data center power needs expected to treble by 2030, something has to change — and, Haas says, that should start with a move to Arm.

Arm chief executive officer Rene Haas has admitted to the "insatiable energy needs" of the current explosion in artificial intelligence (AI), and in particular generative AI (gen AI) training and operation — but says his company is uniquely positioned to help, pointing to its latest Neoverse CPU IP as offering better power efficiency than its rivals.

"AI has the potential to exceed all the transformative innovations created in the past century. The benefits to society around health care, productivity, education and many other areas will be beyond our imagination," Haas claims. "To run these complex AI workloads, the amount of compute required in the world’s data centers needs to exponentially scale. However, this insatiable need for compute has exposed a critical challenge: the immense power data centers require to fuel this groundbreaking technology."

Today, Haas says, data centers already draw around 460 terawatt-hours of electricity each year — equivalent to the entire annual electricity consumption of Germany. By 2030, he claims, demand for artificial intelligence will have seen this figure triple to more than the total power consumption of India, the world's most populous country. "Companies need to rethink everything to tackle energy efficiency," he says — and that thinking should include a shift to Arm technology, naturally.
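As a rough back-of-envelope check on those figures: the 460 terawatt-hour baseline and the tripling by 2030 come from Haas's claims, while the national comparison values below are approximate public estimates added for illustration, not figures from the article.

```python
# Back-of-envelope check of the data center energy figures cited above.
# The 460 TWh baseline and the tripling by 2030 are Haas's claims; the
# national consumption figures are approximate estimates (assumptions
# for illustration only, not taken from the article).

DATA_CENTER_TWH_TODAY = 460        # claimed annual data center draw, TWh
GROWTH_FACTOR_BY_2030 = 3          # "treble by 2030"

GERMANY_ELECTRICITY_TWH = 500      # approx. annual electricity use (assumption)
INDIA_ELECTRICITY_TWH = 1_500      # approx. annual electricity use (assumption)

projected_2030 = DATA_CENTER_TWH_TODAY * GROWTH_FACTOR_BY_2030
print(f"Today: ~{DATA_CENTER_TWH_TODAY} TWh "
      f"(vs. Germany at roughly {GERMANY_ELECTRICITY_TWH} TWh)")
print(f"Projected 2030: ~{projected_2030} TWh "
      f"(vs. India at roughly {INDIA_ELECTRICITY_TWH} TWh)")
```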

Currently, most data centers run on x86 processors from Intel or AMD, coupled with accelerators from Intel, AMD, or NVIDIA. While Arm has enjoyed considerable success in embedded and portable computing, it has long struggled to break into the data center — something Haas says is shifting with the release of its latest Neoverse processor technology, which has been adopted by Amazon, Microsoft, Google, and Oracle.

"As Arm deployments broaden, these companies could save upwards of 15 percent [of] the total data center power," Haas says. "Those enormous savings could then be used to drive additional AI capacity within the same power envelope and not add to the energy problem. To put it in perspective, these energy savings could run two billion additional ChatGPT queries, power a quarter of all daily web search traffic, light 20 percent of American households, or power a country the size of Costa Rica."

To put hard figures to the claims, Haas points to Amazon's Arm-based Graviton processors, which deliver a claimed 25 percent performance boost for AI inference with a 60 percent efficiency gain over their direct competition; Google's Axion, which claims a 50 percent performance and 60 percent efficiency gain over competitors; and Microsoft's Cobalt, which delivers a claimed 40 percent performance gain.

Arm is far from being the only company that thinks it has the answer to the growing energy demands of the AI shift, though: Intel has just finished deploying its Loihi 2-based Hala Point system at Sandia National Laboratories, using brain-inspired neuromorphic computing to deliver efficiency gains over traditional CPU and GPU parts, while NVIDIA's Blackwell platform promises a 25-fold energy efficiency improvement over the company's previous generation equivalent — with Arm's processor IP sitting alongside NVIDIA's GPU design.

More information on Arm Neoverse is available on the company website.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.