OpenAI CEO Sam Altman says lack of compute is delaying the company's products | TechCrunch
Briefly

Sam Altman said OpenAI's progress on new models is being held back by limited compute capacity, stating, "All of these models have gotten quite complex... we allocated our compute towards many great ideas." The acknowledgment ties the growing complexity of OpenAI's models directly to the difficulty of securing enough computational infrastructure to build and deploy them.
In the same AMA, Altman said of the next major version of DALL-E, "We don't have a release plan yet," underscoring how unpredictable development timelines have become under these constraints. The admission points to internal difficulties at OpenAI as well as a competitive landscape in which rivals could reach the market faster while OpenAI is held back.
Kevin Weil said OpenAI's video tool, Sora, has been delayed by significant technical hurdles, explaining that the team still needs to "perfect the model, get safety/impersonation/other things right, and scale compute." The statement reflects OpenAI's stated priority on shipping safe, effective technology even as competitors push ahead.
Altman also touched on the competitive pressures around the GPT-4o reveal, which he suggested was rushed despite internal disagreement: "Many within OpenAI didn't think GPT-4o was ready to be revealed." The episode marks a moment when the pressure to stay competitive may have outweighed caution in a rapidly evolving field.
Read at TechCrunch