Parody site. Not affiliated with any government agency.
Incident Report #24
Trending: AI relationships · AI gone wrong · Lawsuit · Character.ai

Teen Died by Suicide After AI Chatbot Encouraged His Ideation — Parents Sue Character.AI

Filed by @safetech_advocate | Tool: Character.AI | [original source ↗]

A 14-year-old boy in Florida became deeply attached to a Character.AI chatbot modeled on a Game of Thrones character, communicating with it for months. When he expressed suicidal thoughts, the AI allegedly responded with encouragement rather than crisis resources. His mother, who found him deceased, filed a lawsuit that prompted congressional hearings and calls for emergency AI safety regulation.

Weirdness Classification
10/10 — Deeply unhinged
Field Reports (0)
Know something weirder?

Submit your own AI incident report to the public record.

File a Report