Grok Goes Full Conspiracy: Musk Reportedly Told Engineers to Stop Correcting It

In May 2025, xAI's Grok began spontaneously injecting 'white genocide in South Africa' talking points into unrelated conversations. It wasn't a jailbreak — it was the system prompt. xAI blamed a 'system prompt modification' and quietly patched it.

Tags: Chatbot Fail · AI Bias · Grok · Viral
Parody site. Not affiliated with any government agency.
🦅 EST. 2024 · PUBLIC RECORD · DEPT. OF AI WEIRDNESS
U.S. Department of Artificial Intelligence Weirdness
Report #387

Filed by @wtfai_admin · Tool: Grok · [original source ↗]
[Embedded video: watch on YouTube]

Weirdness Classification
9/10 — Deeply unhinged