China's initiatives to enhance media access for its deaf population through AI technology have encountered numerous problems. Despite government support for using avatars for real-time sign language translation since the 2022 Winter Olympics, investigations reveal that important information is lost or distorted. Issues include discrepancies in hand movements, facial expressions, and vocabulary limitations of the avatars. Viewers reported difficulty understanding these avatars, indicating a fundamental misunderstanding of the differences between signed and spoken languages.
We transcribed and back-translated the sign language created by the avatars, then compared the results with the original audio, finding that a significant amount of key information was lost or distorted in the AI-generated version.
On closer inspection, the movements of the avatars differed considerably from everyday sign language in terms of hand shape, position, direction, and movement.
In interviews, viewers said they generally could not understand the avatars' movements, and noted that the avatars appeared to have a limited vocabulary and struggled with words that have multiple meanings.
Zheng attributes the avatars' poor performance to the fact that roughly 50 percent of the gestures in Chinese Sign Language have no direct equivalents in spoken Chinese.