
"Meta is in talks with Google about using TPUs (Tensor Processing Units) from 2027, with possible lease agreements via Google Cloud as early as 2026. The move would make Google a serious alternative to Nvidia as a supplier of AI hardware; Nvidia, which depends on a small number of large customers, could be significantly affected. According to Bloomberg, citing The Information and insiders, Meta may start deploying TPUs in its own data centers."
"Anthropic, creator of the Claude model series, has already signed a deal with Google to lease up to one million TPUs on Google Cloud. Counting chips is a somewhat unusual way to express computing power, but the expected power consumption under that deal will exceed 1 gigawatt by 2026. GPUs were originally designed for graphics processing and turned out, somewhat coincidentally, to also suit generative AI: their strength in massively parallel calculations has made them the default hardware for AI workloads."
Meta is negotiating with Google to use Tensor Processing Units (TPUs) from 2027, with leases via Google Cloud as early as 2026. The arrangement would position Google as a serious alternative supplier to Nvidia for AI hardware and could reduce Nvidia's leverage given its reliance on a few large customers. Meta and Google Cloud already have a more-than-$10 billion, six-year cloud infrastructure agreement providing Google servers and storage for AI training and inferencing. Those cloud resources addressed capacity shortages while Meta builds data centers. Google's TPUs are application-specific integrated circuits optimized for specific AI workloads, and Google leverages DeepMind feedback to refine TPU designs for generative and agentic AI.
Read at Techzine Global