AI Model Used by Hospitals Caught Making Up Details About Patients, Inventing Nonexistent Medications and Sexual Acts

Whisper, an AI transcription tool, is unreliable and poses risks in high-stakes settings like healthcare.