OpenAI's Whisper Was Used in Medical Settings — And Was Hallucinating Words That Were Never Spoken

An AP investigation found that OpenAI's Whisper audio transcription tool — deployed by hospitals and medical providers across the US — was silently adding words, phrases, and sentences that were never spoken in the audio. Fabricated content included patient names, fictional medical procedures, and in some cases racial slurs and statements about violence. Providers using Whisper as a medical scribe weren't always checking transcripts for accuracy, meaning hallucinated content was potentially making it into patient records.
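The article's core problem is that Whisper emits fluent text with no built-in "this may be fabricated" flag, so unreviewed transcripts look trustworthy. As a minimal sketch of one mitigation, the open-source openai-whisper package does expose per-segment decoder statistics (`avg_logprob` and `no_speech_prob`) in the dict returned by `whisper.transcribe()`, which can be used to flag low-confidence segments for human review. The threshold values below mirror that package's own fallback defaults; the mocked result dict and its numbers are illustrative, and this heuristic is not a hallucination detector.

```python
def flag_suspect_segments(result, logprob_floor=-1.0, no_speech_ceiling=0.6):
    """Return segments whose decoder statistics suggest low confidence.

    `result` follows the schema of the dict returned by the open-source
    whisper.transcribe(): each segment carries `avg_logprob` (mean token
    log-probability) and `no_speech_prob` (probability the audio window
    contained no speech). A segment decoded confidently over near-silence
    is a classic site for hallucinated text, so both signals matter.
    Thresholds mirror whisper's fallback defaults; treat flagged segments
    as "needs human review", not as proven fabrications.
    """
    return [
        seg
        for seg in result.get("segments", [])
        if seg["avg_logprob"] < logprob_floor
        or seg["no_speech_prob"] > no_speech_ceiling
    ]

# Mocked transcription result (illustrative values, not real output):
mock_result = {
    "text": "Patient reports mild chest pain. Take the medication twice daily.",
    "segments": [
        {"id": 0, "text": "Patient reports mild chest pain.",
         "avg_logprob": -0.21, "no_speech_prob": 0.02},
        # Low log-probability over a mostly silent window: review this one.
        {"id": 1, "text": "Take the medication twice daily.",
         "avg_logprob": -1.40, "no_speech_prob": 0.81},
    ],
}

for seg in flag_suspect_segments(mock_result):
    print(f"review segment {seg['id']}: {seg['text']!r}")
```

A gate like this would not have caught every fabrication the AP describes, but it is the kind of cheap review queue that the providers in the story reportedly skipped.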

U.S. Department of Artificial Intelligence Weirdness (parody site; not affiliated with any government agency) · Est. 2024 · Public Record
Report #65
Tags: Healthcare, Whisper, OpenAI, Hallucination, Transcription


Filed by @med_records_auditor · Tool: OpenAI Whisper · [original source ↗]


Weirdness Classification
9/10 — Deeply unhinged