Tuesday, March 17, 2026

CPUs gain importance as agentic AI reshapes data centers

Processors long considered secondary to GPUs in artificial intelligence (AI) systems are drawing renewed attention as “agentic AI” drives more complex, multi-step workloads inside data centers.

At AMD’s Advancing AI event, CEO Lisa Su described agentic AI as systems that operate continuously, accessing data and applications to make decisions and complete tasks with minimal human input.

This shift, the company said, is increasing the role of central processing units (CPUs) in coordinating AI operations.

While graphics processing units (GPUs) remain critical for training models and handling parallel computations, CPUs are responsible for orchestrating workloads, managing memory, preparing data, and controlling system operations.

These functions are becoming more demanding as AI moves from training to inference — where tasks involve multiple steps, decision-making, and interaction with enterprise systems.

“As agentic AI proliferates, inference becomes a multistep workflow driving new demand for CPU compute,” AMD said.

The company noted that in modern AI clusters, CPUs handle scheduling, data movement, and control flow to keep accelerators running efficiently. This coordination is seen as essential to maintaining overall system performance.

AMD cited internal estimates showing that its 5th Gen EPYC processors can deliver up to 2.1 times higher performance per core and up to 2.26 times better energy efficiency, measured in operations per watt, compared with Nvidia’s Grace Superchip-based systems.

The growing complexity of inference workloads is also shifting the CPU’s role. Instead of primarily feeding data to GPUs during training, CPUs are increasingly tasked with interpreting results, routing information, and managing decision logic in real time. In agentic AI systems, they also handle API calls, tool usage, and memory queries.
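To make the division of labor concrete, the loop below is a minimal, illustrative sketch of an agentic inference workflow: the accelerator handles the model call, while the CPU runs the surrounding control flow, routes tool requests, and folds results back into the context. All names here (run_model_on_gpu, TOOLS, agent_loop) are hypothetical placeholders for this example, not AMD or vendor APIs.

```python
# Illustrative sketch only: a minimal agentic inference loop.
# The model call stands in for accelerator-bound work; everything
# else (routing, tool calls, context management) is CPU-side logic.

def run_model_on_gpu(prompt: str) -> dict:
    # Placeholder for a GPU inference call. This fake model asks for
    # a tool on the first pass, then finishes once it sees the result.
    if "weather" in prompt and "72F" not in prompt:
        return {"action": "call_tool", "tool": "get_weather", "arg": "Austin"}
    return {"action": "finish", "answer": "It is 72F in Austin."}

TOOLS = {
    # CPU-side tools the agent can invoke: API calls, memory queries, etc.
    "get_weather": lambda city: f"72F in {city}",
}

def agent_loop(task: str, max_steps: int = 5) -> str:
    context = task
    for _ in range(max_steps):               # multi-step workflow (CPU control flow)
        result = run_model_on_gpu(context)   # accelerator does the model math
        if result["action"] == "finish":
            return result["answer"]
        # CPU work: route the tool call, execute it, update the context
        tool_output = TOOLS[result["tool"]](result["arg"])
        context = f"{task}\nTool result: {tool_output}"
    return "step budget exhausted"

print(agent_loop("What's the weather in Austin?"))
```

Even in this toy form, each inference step is bracketed by CPU work, which is the pattern AMD argues makes CPU performance matter more as inference becomes multi-step.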

“Agentic AI is expanding what AI can do. It’s also reinforcing a truth every data center architect already knows: The best AI outcomes come from balanced systems,” AMD said.

The company is positioning its EPYC processors alongside its Instinct GPUs, Pensando networking technologies, and ROCm software stack as part of a unified AI infrastructure.

It added that its next-generation EPYC chips, codenamed “Venice,” are expected to power future AI systems, including its planned “Helios” rack-scale architecture.

Industry-wide, the rise of AI is driving increased demand for compute resources and prompting a new cycle of server upgrades, with system-level performance and energy efficiency becoming key considerations for data center operators.
