Amazon ready to use its own AI chips, reduce its dependence on Nvidia
Briefly

Amazon chief executive Andy Jassy said the company expects capital spending of approximately $75 billion in 2024, most of it directed at technology infrastructure. That is a substantial increase from last year's $48.4 billion, and Jassy indicated spending will rise further in 2025, underscoring Amazon's commitment to building out infrastructure amid an AI race dominated by the major tech companies.
Daniel Newman of The Futurum Group described how quickly cloud computing is changing: "Every one of the big cloud providers is feverishly moving towards a more verticalised and, if possible, homogenized and integrated [chip technology] stack." The shift is toward greater efficiency and integration, with the big cloud providers prioritizing bespoke chip designs over off-the-shelf parts.
Rami Sinno, director of engineering at Annapurna Labs, Amazon's chip-design subsidiary, emphasizes that the effort is about whole systems rather than individual components: "It's not [just] about the chip, it's about the full system." Effective AI infrastructure depends on a cohesive system, not any single part.
Amazon's in-house chip effort, which began with security hardware, has advanced significantly. Sinno remarked, "It's really hard to do what we do at scale. Not too many companies can." Amazon's scale, and the intricacy of building bespoke system architecture, give it an unusual position in the market.
Read at Ars Technica