'Sorry, I didn't get that': AI misunderstands some people's words more than others
Briefly

The concept of humanlike AI assistants has evolved since the film 'Her.' Generative AI tools such as ChatGPT and digital assistants such as Siri and Alexa are now common, but they still lag behind humans in understanding speech. Research shows these systems perform poorly for speakers with non-native accents, speakers of African American Vernacular English, and other demographic groups. Unlike empathetic human listeners, these systems cannot interpret intonation or gestures, resulting in communication barriers. As more organizations implement these tools, users face challenges, particularly in essential services like healthcare and emergency response, exposing significant limitations in existing technology.
Twelve years after the film 'Her' imagined a humanlike AI companion, this is no longer the stuff of science fiction. Generative AI tools like ChatGPT and digital assistants like Apple's Siri and Amazon's Alexa help people get driving directions, make grocery lists, and plenty else.

Yet unlike you or me, automatic speech recognition systems are not what researchers call 'sympathetic listeners.' Instead of trying to understand you by taking in other useful clues like intonation or facial gestures, they simply give up.