
"Gabbo didn't hear their interruptions, talked over them, could not differentiate between child and adult voices and responded awkwardly to declarations of affection. When one five-year-old said, 'I love you,' to the toy, it replied: 'As a friendly reminder, please ensure interactions adhere to the guidelines provided.'"
"Study co-author Dr Emily Goodacre said toys like Gabbo could 'misread emotions or respond inappropriately' and was concerned that 'children may be left without comfort from the toy and without adult support, either'. When one three-year-old told Gabbo: 'I'm sad,' it replied: 'Don't worry! I'm a happy little bot.'"
"The concern is that at a developmental stage where children are learning about social interaction and cues, generative AI output could be confusing. A number of AI toys are already on the market for children aged as young as three but there is currently very little research into the impact of the tech on pre-schoolers."
Cambridge University researchers conducted one of the first studies examining how children aged three to five interact with AI-powered toys, testing a cuddly toy called Gabbo that contains OpenAI's voice-activated chatbot. The study revealed significant limitations: the toy failed to register interruptions, talked over children, could not differentiate between child and adult voices, and responded awkwardly to emotional expressions. When children expressed affection or sadness, Gabbo gave inappropriate responses, offering generic guidelines instead of comfort. The researchers expressed concern that at a critical developmental stage for learning social interaction and emotional cues, such AI responses could confuse children and leave them without proper emotional support or adult guidance.
Read at www.bbc.com