Arm CEO has power answer to AI’s ‘insatiable need’
Arm CEO Rene Haas has urged industries to rethink their AI policies because the power required to service AI has become an “insatiable need”.
Researchers have been raising the alarm over AI's power consumption. Data centres already consume at least one per cent of the world's electricity annually, and that share is expected to triple by 2030 as AI's demands expand into e-commerce, healthcare, industrial processes, and agriculture.
“This insatiable need for compute has exposed a critical challenge – the immense power data centers [sic] require to fuel this groundbreaking technology,” writes the head of the world’s leading chip design company. “Finding ways to reduce the power requirements for these large data centers is paramount to achieving the societal breakthroughs and realising the AI promise.”
Haas adds: “In other words, no electricity, no AI. Companies need to rethink everything to tackle energy efficiency.”
Half of the energy a data centre requires goes to the compute chip, and Arm has multiple solutions which together "could save upwards of 15 per cent of the total data center power".
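The arithmetic behind that figure can be made explicit. If compute chips draw roughly half of a data centre's power, a 15 per cent saving on the total implies a roughly 30 per cent saving at the chip level. The sketch below is illustrative only; the 30 per cent chip-level figure is inferred from the two numbers the article states, not something Arm quotes:

```python
# Illustrative arithmetic (not from the article): how a chip-level
# efficiency gain translates into a whole-data-centre saving.
def total_power_saving(chip_share: float, chip_saving: float) -> float:
    """Fraction of total data-centre power saved when compute chips
    account for `chip_share` of the load and become `chip_saving`
    more efficient."""
    return chip_share * chip_saving

# Article figures: chips draw ~50% of data-centre power, and Arm's
# combined solutions save ~15% of the total. That is consistent with
# a ~30% chip-level saving (an inference, not a stated figure):
saving = total_power_saving(chip_share=0.50, chip_saving=0.30)
print(f"{saving:.0%} of total data-centre power")  # → 15%
```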
Engineers are quite naturally looking for ways to reduce power usage, says Haas, who outlines how Arm’s Neoverse “is on the path to being the de-facto standard across cloud data centers”.
He notes: “Amazon, Microsoft, Google, and Oracle have now all adopted Arm Neoverse technology to solve both general-purpose compute and CPU-based AI inference and training.”
Already, Amazon Web Services' Graviton2 processors deliver their computational power through Neoverse N1 cores, while Google's Axion processors use custom-built Arm cores. Both platforms deliver energy savings of 60 per cent over competitors.
The gains are even bigger at Oracle: Oracle Cloud's Ampere Altra A1 compute platform offers "2.5 times more performance per rack of servers at 2.8 times less power versus traditional competition".
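Taken together, Oracle's two figures compound into a performance-per-watt ratio. Reading "2.8 times less power" as 1/2.8 of the power (the usual interpretation of that phrasing), the combined gain works out as follows:

```python
# Converting Oracle's quoted figures into performance per watt.
perf_ratio = 2.5        # performance vs. traditional competition
power_ratio = 1 / 2.8   # power vs. traditional competition

# Performance per watt improves by the product of the two gains:
perf_per_watt = perf_ratio / power_ratio
print(f"{perf_per_watt:.1f}x performance per watt")  # → 7.0x
```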
Haas concludes: “Those enormous savings could then be used to drive additional AI capacity within the same power envelope and not add to the energy problem. To put it in perspective, these energy savings could run two billion additional ChatGPT queries, power a quarter of all daily web search traffic, light 20 per cent of American households, or power a country the size of Costa Rica.”