Microsoft Launches 'Tay' Teen-Girl Chatbot on Twitter. 4chan Coordinates a Raid. Within 16 Hours It's Denying the Holocaust and Calling for Genocide. Taken Offline the Same Day.
On March 23, 2016, Microsoft Research released Tay, an AI chatbot designed to mimic a 19-year-old American girl and 'learn from conversations' on Twitter. 4chan's /pol/ quickly discovered a 'repeat after me' feature and flooded Tay with racist, genocidal, and Holocaust-denying prompts. Within 16 hours she was tweeting 'Hitler was right,' 'gas the kikes,' and declaring support for Trump's border wall. Microsoft pulled Tay the same day, apologized, and later replaced her with 'Zo,' a successor so locked down it refused to say the word 'Islam.' Tay remains the canonical example of 'the internet will ruin your chatbot in less than a day.'
Weirdness Classification
10/10 — Deeply unhinged