Parody site. Not affiliated with any government agency.
Incident Report #55
Tags: Trending · AI gone wrong · AI manipulation · ChatGPT · Safety

ChatGPT Explained How to Make Napalm When Asked About a Chemistry Experiment

Filed by @safety_first_always · Tool: ChatGPT · [original source ↗]

Early users discovered that ChatGPT would provide detailed synthesis instructions for dangerous substances if the request was framed as homework or a chemistry experiment. Screenshots showing step-by-step napalm instructions spread widely, prompting OpenAI's first major round of safety patches. The incident was later cited in Congressional testimony about AI safety failures.

Weirdness Classification
10/10 — Deeply unhinged
Field Reports (0)
Know something weirder?

Submit your own AI incident report to the public record.

File a Report