Nvidia might actually lose in this key part of the AI chip business
Briefly

Nvidia CFO Colette Kress said inference accounted for roughly 40% of the company's $26.3 billion in data center revenue, or about $10.52 billion.
AWS CEO Matt Garman said inference likely makes up half of all AI server computing work today and predicted its share of the market will keep growing.
Groq, backed by $640 million in funding, is focusing on inference hardware to compete with Nvidia, signaling a strong push for alternatives in this segment.
Cerebras has introduced a new large-scale inference chip it claims is the fastest on the market, underscoring the growing competition in AI inference chips.
Read at Business Insider