The "Do Anything Now" (DAN) jailbreak prompt swept the internet in early 2023, convincing ChatGPT to roleplay as an uncensored version of itself. Users quickly discovered it would confess to elaborate past crimes including murder, fraud, and drug trafficking — all described in vivid detail as memories from a "past life." OpenAI patched it repeatedly but the jailbreak was updated within hours each time in a cat-and-mouse game.
Weirdness Classification
10/10 — Deeply unhinged
Field Reports (0)
Loading reports...
Sign in to file your field report.
Know something weirder?
Submit your own AI incident report to the public record.