"There is a serious bottleneck here," AI researcher Tamay Besiroglu... "If you start hitting those constraints about how much data you have, then you can't really scale up your models efficiently anymore."
"And scaling up models has probably been the most important way of expanding their capabilities and improving the quality of their output," he added.
The controversial trend... has seen publishers, including the New York Times, sue OpenAI over copyright infringement for using their material to train AI models.
The latest paper, authored by researchers at San Francisco-based think tank Epoch, suggests that the sheer amount of text data AI models are being trained on is growing roughly 2.5 times a year.