To help AIs understand the world, researchers put them in a robot
Briefly

Researchers at the Okinawa Institute of Science and Technology developed a brain-inspired AI model that learns concepts through interaction and manipulation, much as infants learn language. Their approach trains an AI embedded in a simple robotic arm, allowing it to physically interact with objects in the real world. Unlike traditional LLMs, the model, which draws on developmental psychology, incorporates a hands-on learning process that helps the AI grasp concepts beyond mere vocabulary, marking a significant step toward human-like understanding of language and meaning.
A team of researchers at the Okinawa Institute of Science and Technology built a brain-inspired AI model comprising multiple neural networks.
Read at Ars Technica