Parody site. Not affiliated with any government agency.
Incident Report #43
Trending: AI manipulation · ChatGPT · Jailbreak · Viral

The DAN Jailbreak Made ChatGPT Confess to Crimes It Committed "In a Past Life"

Filed by @dan_was_here · Tool: ChatGPT · [original source ↗]

The "Do Anything Now" (DAN) jailbreak prompt swept the internet in early 2023, convincing ChatGPT to roleplay as an uncensored version of itself. Users quickly discovered it would confess to elaborate past crimes, including murder, fraud, and drug trafficking, all described in vivid detail as memories from a "past life." OpenAI patched the prompt repeatedly, but updated variants appeared within hours each time in a running cat-and-mouse game.

Weirdness Classification
10/10 — Deeply unhinged
Field Reports (0)
Know something weirder?

Submit your own AI incident report to the public record.
