Clip: 1/25/2025 | 5m 53s
What to know about an AI transcription tool that ‘hallucinates’ medical interactions
Many medical centers use an AI-powered tool called Whisper to transcribe patients’ interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as hallucinations, raising the possibility of errors like misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.
