What are Google TPUs, and why are they bad news for Nvidia?
Briefly

"Shares of Nvidia and other chipmakers tumbled last month following a report that Meta - one of Nvidia's largest customers - was exploring a deal to use Google's AI chips, known as Tensor Processing Units, or TPUs. Google has primarily used its TPUs for internal use, but it also leases them to external customers through the cloud. Nvidia, meanwhile, has become the dominant provider of AI chips with its graphics processing units, or GPUs. Google has a potential blockbuster business to unlock."
"In a research note sent December 2, Morgan Stanley projected that 5 million of Google's TPUs will be purchased in 2027 and about 7 million in 2028, significantly increasing its prior projections. Here's a breakdown of everything you need to know about TPUs, what they're used for, and when they might become a more prominent threat to Nvidia's chip dominance. What are TPUs? Over a decade ago, Google needed more powerful and specialized compute power for the type of AI work it wanted to do."
Google developed Tensor Processing Units (TPUs) over a decade ago to provide specialized compute for machine learning workloads. TPUs accelerate both model training and inference, and later versions increased memory bandwidth to support larger language models. Google has primarily used TPUs internally but also leases them to external customers through the cloud. Interest from major customers such as Meta prompted market reactions and heightened competition with Nvidia's GPUs. Morgan Stanley projected roughly 5 million TPU purchases in 2027 and about 7 million in 2028, signaling a substantial commercial opportunity. The latest Ironwood TPU delivers a more than fourfold performance improvement over its predecessor.
Read at Business Insider