
""...in the past, the CPU to GPU ratio was primarily just as a host node in like a 1:4 or 1:8 configuration node, now changing and getting closer to a 1:1 configuration or even.""
""Training a frontier model is GPU-heavy. Running inference at scale with agentic systems spawning sub-tasks is much more CPU-balanced.""
""Su now expects the server CPU TAM to grow at greater than 35% annually, reaching over $120 billion by 2030, roughly double her November guidance.""
""AMD already owns the share-gain story against Intel (NASDAQ:INTC) in EPYC, with server CPU revenue up more than 50% YoY and Su targeting greater than 50% share of that market.""
AI data center hardware is expected to move from a GPU-dominant configuration toward a more balanced CPU and GPU mix. Training frontier models remains GPU-heavy, but large-scale inference combined with agentic workflows increases CPU involvement. AMD CEO Lisa Su projects the server CPU total addressable market (TAM) to grow at more than 35% annually, reaching over $120 billion by 2030. AMD is positioned to benefit through EPYC share gains versus Intel, with server CPU revenue growing more than 50% year over year and a target of over 50% market share. Data center results show AMD outpacing Intel's growth, while Instinct GPU deployments and customer commitments support continued AI infrastructure expansion.
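As a rough sanity check on the growth claim, a greater-than-35% CAGR ending above $120 billion in 2030 implies a particular starting market size. The sketch below backs that out; the five-year compounding window (2025 to 2030) and the exact 35% rate are assumptions for illustration, not figures reported by AMD.

```python
# Illustrative back-of-envelope check, not AMD-reported data.
# Assumptions: end value of $120B in 2030, exactly 35% CAGR,
# and 5 compounding years (2025 -> 2030).

def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Back out the starting TAM implied by an end value and a CAGR."""
    return end_value / (1 + cagr) ** years

base = implied_base(120.0, 0.35, 5)
print(round(base, 1))  # ~26.8, i.e. an implied ~$27B 2025 base
```

A higher actual base year TAM would imply either a lower realized CAGR or an end value well above $120 billion, which is why the ">35%" and "over $120 billion" qualifiers matter in the quote.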
#ai-data-centers #cpu-vs-gpu-mix #amd-epyc #inference-and-agentic-workloads #server-cpu-market-growth
Read at 24/7 Wall St.