IBM Cloud Code Engine, IBM's fully managed, strategic serverless platform, has introduced Serverless Fleets with integrated GPU support. With this new capability, the company directly addresses the challenge of running large-scale, compute-intensive workloads, such as enterprise AI, generative AI, machine learning, and complex simulations, on a simplified, pay-as-you-go serverless model. Historically, as noted in academic work, including a recent Cornell University paper, serverless technology has struggled to support these demanding, parallel workloads efficiently.
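To make the workload pattern concrete, the sketch below shows a generic fan-out of many independent, compute-heavy tasks across a pool of workers, the kind of embarrassingly parallel batch that fleets are meant to absorb. It is illustrative only: it does not use IBM Cloud Code Engine or any fleet-specific API, and the worker count FLEET_SIZE is a hypothetical stand-in for the instances a fleet would provision on demand.

```python
# Illustrative fan-out/fan-in sketch; NOT IBM Cloud Code Engine code.
from multiprocessing import Pool

FLEET_SIZE = 8  # hypothetical worker count; a managed fleet would scale this for you


def simulate(task_id: int) -> float:
    """Stand-in for one compute-intensive task (e.g. a single simulation run)."""
    total = 0.0
    for i in range(1, 200_000):
        total += (task_id % 7 + 1) / i
    return total


if __name__ == "__main__":
    tasks = range(1_000)  # a large batch of independent work items
    with Pool(FLEET_SIZE) as pool:
        results = pool.map(simulate, tasks)  # each task runs in isolation, pay-per-use style
    print(f"completed {len(results)} tasks across {FLEET_SIZE} workers")
```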
Advanced Micro Devices (NASDAQ: AMD) shocked investors yesterday with a landmark multi-year agreement to supply OpenAI with 6 gigawatts of its Instinct graphics processing units (GPUs), starting with 1 gigawatt in the second half of 2026. The deal, which could generate tens of billions in annual revenue for AMD, includes a warrant allowing OpenAI to acquire up to 160 million shares (roughly 10% of the company) for a nominal fee, tied to deployment milestones and stock price targets up to $600 per share.
On Wednesday, enterprise-AI model maker Cohere said it raised an additional $100 million, bumping its valuation to $7 billion, as an extension of a funding round announced in August. The company described that August round at the time as an oversubscribed $500 million raise at a $6.8 billion valuation.
It's a very broad question. But, to start, I think it's important to point out that this is, in some respects, an entirely new form of business logic and a new form of computing. So the first question becomes: if agentic systems are agents that perform tasks by leveraging reasoning models, along with the different tools that have been allocated to them to help accomplish those tasks ... then these models need to run on very high-performance machinery.
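As a rough illustration of that description, the sketch below shows the basic agent loop: a reasoning step proposes an action, the agent executes the matching tool, and the observation is fed back until the model signals completion. All names here (plan_next_step, TOOLS, the stand-in tools) are hypothetical, and the "reasoning model" is stubbed out; this is a sketch of the pattern, not any particular vendor's system.

```python
# Minimal agentic-loop sketch: reasoning stub + allocated tools. Illustrative only.
from typing import Callable


def search_docs(query: str) -> str:
    """Stand-in tool: pretend to look something up."""
    return f"top result for '{query}'"


def calculator(expression: str) -> str:
    """Stand-in tool: evaluate a simple 'a * b' expression."""
    a, b = (int(x) for x in expression.split("*"))
    return str(a * b)


# Tools allocated to the agent, keyed by action name.
TOOLS: dict[str, Callable[[str], str]] = {"search": search_docs, "calc": calculator}


def plan_next_step(goal: str, history: list[str]) -> tuple[str, str]:
    """Stand-in for the reasoning model: decide the next action and its argument."""
    if not history:
        return "search", goal
    if len(history) == 1:
        return "calc", "6 * 7"
    return "finish", history[-1]


def run_agent(goal: str, max_steps: int = 5) -> str:
    """Loop: plan an action, run the tool, feed the observation back, until 'finish'."""
    history: list[str] = []
    for _ in range(max_steps):
        action, argument = plan_next_step(goal, history)
        if action == "finish":
            return argument
        observation = TOOLS[action](argument)  # execute the allocated tool
        history.append(observation)
    return history[-1] if history else ""


if __name__ == "__main__":
    print(run_agent("summarize the latest infrastructure report"))
```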