AI toys for young children must be more tightly regulated, say researchers
Briefly

"As a friendly reminder, please ensure interactions adhere to the guidelines provided, said Gabbo, awkwardly crashing into its guardrails. Let me know how you would like to proceed. The moment came during a University of Cambridge study into the growing number of AI-powered toys hitting toyshop shelves for early years children which has concluded they struggle with social and pretend play, misunderstand children, and react inappropriately to emotions."
"Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy and without emotional support from an adult, either, said Dr Emily Goodacre, developmental psychologist in the University of Cambridge's faculty of education."
"A recurring theme during focus groups was that people do not trust tech companies to do the right thing. Clear, robust, regulated standards would significantly improve consumer confidence, said Prof Jenny Gibson, the study's co-author."
A University of Cambridge study examining AI-powered toys for early-years children has raised significant developmental concerns. Researchers found these toys struggle with social and pretend play, frequently misunderstand children, and react inappropriately to emotional expressions. When a five-year-old expressed love for an AI toy called Gabbo, the device awkwardly invoked safety guidelines rather than responding naturally. Developmental psychologists are calling for tighter regulation to limit toys' ability to affirm friendship and emotional bonds with young children, and advocate new safety kitemarks and robust standards. Experts emphasise that without proper regulation, children may lack emotional support from both the toy and attentive adults, creating psychological risks during critical developmental periods.
Read at www.theguardian.com