
Ben Goertzel, who coined the term "AGI", is skeptical of the commercial AI industry's focus on pre-trained transformer models: "the commercial AI industry is just betting everything on copying GPT in various permutations, which in my view is a waste of resources because all these LLMs are kind of doing about the same thing."
"When something works, everyone wants to double and triple down on what worked," he says, highlighting the risk of concentrating resources on a single paradigm in AI development.
He also points to the financial burden of this approach: "Transformer models require billions of dollars in compute to train, along with enormous ongoing computational resources to operate."
And he emphasizes a deeper shortcoming of current models: "A major limitation of transformer models is that they cannot continually learn from new experiences and update their internal parameters in real time the way humans do."
Major AI companies are investing heavily in pre-trained transformer models in the belief that they can achieve human-level general intelligence. Ben Goertzel is skeptical, arguing that the approach wastes resources because the resulting models all perform much the same. He warns that concentrating on a single paradigm is risky, given the enormous financial investment required to train these models. And while scaling has yielded intelligence gains, costs keep rising, and transformer models' inability to learn continuously from experience limits their potential for true intelligence.
Read at Fast Company