OpenAI chooses Amazon Web Services to run ChatGPT, will pay $38 billion for the privilege
Today, OpenAI announced a strategic partnership with Amazon Web Services (AWS) that will allow the creator of ChatGPT to run its advanced AI workloads on AWS infrastructure. The deal is effective immediately.
AWS is providing OpenAI with Amazon EC2 UltraServers, which can scale to hundreds of thousands of Nvidia GPUs and up to millions of CPUs for advanced generative AI workloads.

The official press release said the seven-year deal represents a $38 billion commitment and will help OpenAI “rapidly expand compute capacity while leveraging the price, performance, scale, and security of AWS”. It continues: “AWS has unusual experience running large-scale AI infrastructure securely, reliably, and at scale – with clusters upwards of 500K chips. AWS’s leadership in cloud infrastructure combined with OpenAI’s leading advances in generative AI will help millions of users continue to get value from ChatGPT”.
All of the AWS capacity covered by the deal will be deployed before the end of 2026, with an option to extend beyond 2027. Architecturally, the deployment clusters Nvidia GPUs (both GB200 and GB300) on the same low-latency network, letting OpenAI run tightly interconnected workloads at optimal performance.