A University of Michigan researcher conducting a study of public meetings found hallucinations in eight of every ten Whisper audio transcriptions examined, raising serious concerns about the tool's accuracy.
Experts noted that while occasional errors are expected from AI transcribers, the level of hallucination observed in Whisper exceeds anything they had seen in comparable transcription tools.
OpenAI claims Whisper 'approaches human level robustness and accuracy' on English speech recognition, a claim that sits uneasily with these findings as the tool spreads across industries. With Whisper widely integrated into platforms for transcribing interviews and generating subtitles, fabricated text can be disseminated at scale, making the risk of misinformation especially alarming.
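For context, this is roughly how such platforms invoke Whisper behind the scenes: a minimal sketch using the open-source openai-whisper package, where the model size and audio file name are illustrative placeholders. Note that the output is returned as ordinary text, with nothing flagging which passages, if any, were hallucinated.

```python
# Minimal sketch of a transcription call with the open-source
# openai-whisper package. "base" and "interview.mp3" are placeholders.
import whisper

model = whisper.load_model("base")          # smaller checkpoints trade accuracy for speed
result = model.transcribe("interview.mp3")  # returns the transcript plus timestamped segments

print(result["text"])                       # full transcript; per-segment detail in result["segments"]
```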