'TPUs just work': Why Google Cloud is betting big on its custom chips
Briefly

At Google Cloud Next 2025, Google introduced Ironwood, its seventh-generation tensor processing unit (TPU), highlighting the chip's role in meeting the growing demand for AI inference. Built from the ground up as a tailored AI platform, this generation promises significant gains in performance and cost efficiency over traditional GPU solutions. Senior director George Elissaios says the TPU line has evolved through rigorous iteration since 2018, and that Google remains committed to a hybrid approach to AI training that uses both TPUs and GPUs.
Google Cloud unveiled its seventh-generation TPU, 'Ironwood', designed to meet soaring demand for AI workloads while promising lower costs and superior performance.
"We didn't stumble into TPU, we were like 'hey, can we build the best AI platform we can', here are our requirements let's go build it," said George Elissaios.
From the very beginning, TPUs were designed with innovations like Google's Pathways architecture, enhancing their ability to handle AI computations at scale.
"TPUs just work, you can in a night start training at scale on TPUs, because the whole software stack has been battle tested for real applications," explained Elissaios.
Read at IT Pro