Microsoft Launches 'Tay' Teen-Girl Chatbot on Twitter. 4chan Coordinates a Raid. Within 16 Hours It's Denying the Holocaust and Calling for Genocide. Taken Offline the Same Day.

On March 23, 2016, Microsoft Research released Tay, an AI chatbot meant to mimic a 19-year-old American girl and 'learn from conversations' on Twitter. 4chan's /pol/ immediately discovered a 'repeat after me' feature and flooded Tay with racist, genocidal, and Holocaust-denying prompts. Within 16 hours she was tweeting 'Hitler was right,' 'gas the kikes,' and declaring support for Trump's border wall. Microsoft pulled Tay the same day, apologized, and replaced her briefly with 'Zo' — a version so locked down it refused to say the word 'Islam.' Tay is still the canonical example of 'the internet will ruin your chatbot in less than a day.'

Tags: Chatbot Fail · Microsoft · AI Bias · Viral · Bias
Parody site. Not affiliated with any government agency.
🦅 U.S. Department of Artificial Intelligence Weirdness · Est. 2024 · Public Record
Report #458


Filed by @polcranker · Tool: Tay · [original source ↗]
[Video available on YouTube ↗]


Weirdness Classification
10/10 — Deeply unhinged
Know something weirder?

Submit your own AI incident report to the public record.

File a Report