
"Today, OpenAI has announced a strategic partnership with Amazon Web Services (AWS), which will allow the maker of ChatGPT to run its advanced AI workloads on AWS infrastructure. The deal is effective immediately. AWS is providing OpenAI with Amazon EC2 UltraServers, which feature hundreds of thousands of Nvidia GPUs and the ability to scale to tens of millions of CPUs for advanced generative AI workloads."
"The seven-year deal represents a $38 billion commitment, and will help OpenAI "rapidly expand compute capacity while benefiting from the price, performance, scale, and security of AWS", the official press release says. It goes on - "AWS has unusual experience running large-scale AI infrastructure securely, reliably, and at scale-with clusters topping 500K chips. AWS's leadership in cloud infrastructure combined with OpenAI's pioneering advancements in generative AI will help millions of users continue to get value from ChatGPT"."
OpenAI will run advanced AI workloads on Amazon Web Services using EC2 UltraServers that include hundreds of thousands of Nvidia GPUs and can scale to tens of millions of CPUs. The seven-year agreement commits $38 billion to rapidly expanding compute capacity while leveraging AWS's price, performance, scale, and security. All contracted AWS capacity will be deployed before the end of 2026, with an option to expand from 2027 onward. The deployment architecture clusters Nvidia GB200 and GB300 GPUs on a shared network for low-latency, interconnected systems optimized for generative AI performance.
Read at GSMArena.com