OpenAI Reportedly Hitting Law of Diminishing Returns as It Pours Computing Resources Into AI
Briefly

OpenAI cofounder Ilya Sutskever recently stated, "The 2010s were the age of scaling, now we're back in the age of wonder and discovery once again," highlighting a significant shift in AI development.
As the leaps between new AI model releases shrink, there's concern that scaling efforts may be hitting a plateau, challenging previously held beliefs about constant growth.
Recent findings suggest AI companies, particularly OpenAI, are running into diminishing returns, with increased computing power yielding fewer breakthroughs than anticipated.
Data scientist Yam Peleg commented on the situation, indicating that another prominent AI firm has also "reached an unexpected HUGE wall of diminishing returns" in its scaling efforts.
Read at Futurism