#whisper

#healthcare
Futurism
12 hours ago
Artificial intelligence

AI Model Used By Hospitals Caught Making Up Details About Patients, Inventing Nonexistent Medications and Sexual Acts

Whisper, an AI transcription tool, is unreliable and poses risks in high-stakes settings like healthcare.
Entrepreneur
2 days ago
Artificial intelligence

OpenAI Tool Used By Doctors 'Whisper' Is Hallucinating: Study

OpenAI's Whisper tool is widely used in healthcare, but recent research reveals significant transcription inaccuracies, warranting caution in its adoption.
#openai
WIRED
15 hours ago
Miscellaneous

OpenAI's Transcription Tool Hallucinates. Hospitals Are Using It Anyway

OpenAI's Whisper tool frequently fabricates text, posing risks especially in healthcare, despite warnings against its use in high-stakes environments.
ZDNET
19 hours ago
Artificial intelligence

OpenAI's AI transcription tool hallucinates excessively - here's a better alternative

OpenAI's Whisper AI tool exhibits high rates of hallucination in transcriptions, posing risks of misinformation.
Ars Technica
2 days ago
Artificial intelligence

Hospitals adopt error-prone AI transcription tools despite warnings

OpenAI's Whisper tool may produce inaccurate medical transcripts, creating potential risks in healthcare.
Engadget
2 days ago
Miscellaneous

OpenAI's Whisper invents parts of transcriptions - a lot

Whisper, an OpenAI transcription tool, generates hallucinated text that can misrepresent speakers and insert erroneous statements.
Business Insider
2 months ago
Privacy professionals

ChatGPT appears to be getting confused again - this time in Welsh

OpenAI's ChatGPT is glitching and responding in the wrong languages, due to issues with its speech recognition tool, Whisper.