OpenAI's upcoming model, Orion, reportedly shows only moderate improvements over its predecessor GPT-4, raising concerns about the growth trajectory of generative AI.
The prevailing 'scaling law' holds that more data and computing power should yield better AI models, yet OpenAI's struggles challenge this assumption and have left investors anxious.
As OpenAI's progress appears to have slowed, skepticism is growing across the tech industry about whether rapid improvements in generative AI can be sustained.
With the supply of available textual training material estimated to be depleted by 2028, AI models may soon face fundamental limits on how much further they can advance.