Amazon releases an impressive new AI chip and teases an Nvidia-friendly roadmap | TechCrunch

"AWS used its annual tech conference to formally launch Trainium3 UltraServer, a system powered by the company's state-of-the art, 3 nanometer Trainium3 chip, as well as its homegrown networking tech. As you might expect, the third-generation chip and system offer big bumps in performance for AI training and inference over the second-generation, according to AWS. AWS says the systems are more than four times faster, with four times more memory, not just for training, but for delivering AI apps at peak demand."
"Additionally, thousands of UltraServers can be linked together to provide an app with up to 1 million Trainium3 chips - 10 times the previous generation. Each UltraServer can host 144 chips, according to the company. Perhaps more importantly, AWS says the chips and systems are also 40% more energy efficient than the previous generation. While the world races to build bigger data centers powered by astronomical gigawatts of electricity, data center giant AWS is trying to make systems that drink less, not more."
AWS launched the Trainium3 UltraServer, powered by its 3nm Trainium3 chip and homegrown networking, delivering large performance and memory gains for AI training and inference. The systems are more than four times faster and provide four times more memory, improving delivery of AI applications at peak demand. Each UltraServer hosts 144 chips, and thousands of UltraServers can be linked to scale an application across up to 1 million Trainium3 chips. The chips and systems are about 40% more energy efficient than the previous generation, reducing operational power needs and lowering inference costs for customers. Trainium4 is in development and will interoperate with Nvidia chips.
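As a quick sanity check on those scale figures (a sketch using only the numbers quoted above, not an AWS specification), roughly seven thousand 144-chip UltraServers would need to be linked to reach the 1 million-chip mark:

```python
# Back-of-envelope check of the quoted scale figures (round numbers from the
# article; actual deployment sizes may differ).
chips_per_ultraserver = 144        # AWS: each UltraServer hosts 144 Trainium3 chips
max_chips_per_app = 1_000_000      # AWS: up to 1 million chips for a single app

# Ceiling division: how many linked UltraServers reach the 1 million-chip mark.
ultraservers_needed = -(-max_chips_per_app // chips_per_ultraserver)
print(f"~{ultraservers_needed:,} UltraServers")   # prints ~6,945 UltraServers
```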
Read at TechCrunch