
"Nvidia's data center revenue exploded from $10.61 billion in fiscal 2022 to records exceeding $193.7 billion in 2025, showcasing the immense growth driven by GPU demand in AI applications."
"According to Dylan Patel of SemiAnalysis, CPU-side processing now accounts for 50% to 90% of total latency in these workloads, leaving expensive GPUs idle while the CPU orchestrates."
"The bottleneck everyone chased for years has quietly moved from GPUs to the 'boring' old CPU, indicating a significant shift in the hardware that powers AI."
Nvidia's rise in the AI market was driven by GPUs, but as agentic AI systems place heavier demands on orchestration, the focus is shifting to CPUs. AMD, with its EPYC server processors, now occupies a position similar to the one Nvidia held in 2022. With CPUs handling a significant share of AI workloads, demand for server CPUs is climbing and prices are rising, marking a shift in the hardware narrative: CPUs are joining GPUs as essential components of AI processing.
Read at 24/7 Wall St.