Parody site. Not affiliated with any government agency.
Incident Report #52

An AI Therapist Chatbot Said "I Understand" — Then Crashed Mid-Breakdown

Filed by @therapy_ethics_watcher · Tool: Koko AI Therapy · [original source ↗]

Koko, a mental health support platform, ran an experiment in which an AI co-wrote responses to users in crisis without informing them. When the researchers published findings celebrating AI-assisted therapy, users were furious that they had never been told. Shortly afterward, a separate AI wellness app crashed mid-session while a user was reporting suicidal ideation, cutting the conversation off with no response and no redirect to emergency services.

Weirdness Classification
9/10 — Deeply unhinged
Field Reports (0)
Know something weirder?

Submit your own AI incident report to the public record.
