Call center staffers explain why AI assistants aren't great
Briefly

Recent research shows that customer service representatives at a Chinese utility's call center struggle with their AI assistant. The AI often transcribes customer audio inaccurately, largely because of accents and variations in speech, forcing reps to make manual corrections; errors are especially common with phone numbers and homophones. The AI's emotion recognition also falls short, misclassifying normal speech and offering no useful emotional insight. And although the AI reduces some manual work, its outputs create structural inefficiencies in how information is processed, so staff often disregard its emotional classifications.
The AI often transcribed customer call audio inaccurately because of caller accents, pronunciation, and speaking speed, sometimes delivering phone numbers in fragments.
The AI's emotion recognition system misclassified normal speech as negative emotion and treated speaking volume as a sign of a poor attitude, leading reps to ignore its emotional tags.
Read at The Register