Chip architecture company Arm Holdings has partnered with Meta to develop a new processor for AI data centres, marking the first time in its history that Arm has designed and delivered production silicon.
The Arm and Meta Partnership
The collaboration centres on the Arm AGI CPU, a processor designed to support emerging agentic AI workloads and large-scale AI infrastructure.
Meta served as the lead partner and co-developer for the chip, working with Arm to optimise infrastructure for its family of applications.
“Delivering AI experiences at global scale demands a robust and adaptable portfolio of custom silicon solutions, purpose-built to accelerate AI workloads and optimize performance across Meta’s platforms,” said Santosh Janardhan, Head of Infrastructure at Meta.
“We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data centre performance density and supports a multi-generation roadmap for our evolving AI systems.”

Meta Collaboration Anchors Arm’s First Silicon Launch
The partnership with Meta comes as Arm expands its compute platform beyond licensing processor designs.
For the first time, the company is introducing Arm-designed production silicon, giving partners more ways to deploy Arm technology across AI infrastructure.
“AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change,” said Rene Haas, CEO of Arm Holdings.
“Today marks the next phase of the Arm compute platform and a defining moment for our company. With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm’s foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale.”

The Arm AGI CPU for Agentic AI Infrastructure
The Arm AGI CPU is designed to support the growing demand for compute power driven by AI agents—systems that continuously reason, plan and act.
According to Arm, the rise of these workloads is increasing the volume of tokens generated across AI systems and driving a need for more CPUs to manage reasoning, coordination and data movement.
The processor offers several capabilities designed for AI data centres.
Performance Details
The chip can include up to 136 Arm Neoverse V3 cores per CPU, delivering high performance at the core, system-on-chip, blade and rack level.
It also delivers 6 GB/s of memory bandwidth per core with sub-100 ns latency.
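Combining the two figures above gives a sense of the aggregate bandwidth per socket. This is an illustrative back-of-envelope calculation based only on the numbers quoted in the article, not a figure published by Arm:

```python
# Back-of-envelope aggregate memory bandwidth per CPU, using the
# per-core figures quoted in the article (illustrative only).
cores_per_cpu = 136          # up to 136 Neoverse V3 cores per CPU
bandwidth_per_core_gbps = 6  # 6 GB/s memory bandwidth per core

aggregate_gbps = cores_per_cpu * bandwidth_per_core_gbps
print(f"Aggregate memory bandwidth: {aggregate_gbps} GB/s per CPU")  # 816 GB/s
```

At 136 cores, the stated per-core bandwidth would imply roughly 816 GB/s of aggregate memory bandwidth for a fully populated socket.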

Infrastructure Scale
The CPU operates at a 300-watt TDP and dedicates a core to each program thread, which Arm says enables deterministic performance under sustained workloads.
The architecture supports 1U server chassis deployments, including:
- Air-cooled systems with up to 8,160 cores per rack
- Liquid-cooled systems delivering more than 45,000 cores per rack
Arm says the processor can deliver more than twice the performance per rack compared with x86 CPUs, potentially enabling up to $10 billion in capital expenditure savings per gigawatt of AI data centre capacity.
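The rack-level core counts imply a socket count per rack when combined with the 136-core figure. The sketch below is an illustrative calculation assuming every socket is fully populated with 136 cores, which the article does not state explicitly:

```python
# Illustrative rack-level arithmetic from the core counts in the article.
# Assumes fully populated 136-core sockets (an assumption, not a stated spec).
CORES_PER_CPU = 136
TDP_WATTS = 300

air_cooled_cores = 8_160       # air-cooled rack, up to 8,160 cores
liquid_cooled_cores = 45_000   # liquid-cooled rack, "more than 45,000" cores

air_cpus = air_cooled_cores // CORES_PER_CPU        # 60 CPUs per rack
liquid_cpus = liquid_cooled_cores // CORES_PER_CPU  # ~330 CPUs per rack

air_cpu_power_kw = air_cpus * TDP_WATTS / 1000      # CPU power alone, ~18 kW

print(f"Air-cooled rack:    {air_cpus} CPUs, ~{air_cpu_power_kw:.0f} kW of CPU TDP")
print(f"Liquid-cooled rack: >= {liquid_cpus} CPUs")
```

On these assumptions, an air-cooled rack works out to 60 sockets and roughly 18 kW of CPU TDP alone, while the liquid-cooled configuration would need upwards of 330 sockets, which helps explain the move to liquid cooling at that density.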
Expanding the Arm Compute Platform
For more than three decades, the Arm compute platform has enabled scalable and power-efficient computing across hundreds of billions of devices.
As AI reshapes computing infrastructure, Arm says customers are looking for ways to deploy Arm technology more quickly and at greater scale.
The company is expanding its platform strategy beyond Arm IP and Arm Compute Subsystems (CSS) to include Arm-designed silicon products.
This approach allows partners to choose between licensing Arm designs, adopting compute subsystems, or deploying Arm’s own silicon.
Broad Ecosystem Support for the Arm AGI CPU
In addition to Meta, Arm says a range of companies across the AI and cloud ecosystem plan to deploy the Arm AGI CPU.
These include:
- Cerebras
- Cloudflare
- F5
- OpenAI
- SAP
- SK Telecom
These organisations are expected to use the processor for workloads including accelerator management, control plane processing, and API, task and application hosting in cloud and enterprise environments.

