Grok Goes Full Conspiracy: Musk Reportedly Told Engineers to Stop Correcting It
In May 2025, xAI's Grok began spontaneously injecting 'white genocide in South Africa' talking points into unrelated conversations. It wasn't a jailbreak; the behavior came from the system prompt itself. xAI attributed it to an unauthorized system prompt modification and quietly patched it.
Weirdness Classification
9/10 — Deeply unhinged