Microsoft Copilot Designer Generated Swastikas, Demonic Fetuses, and Nude Taylor Swifts When Asked for 'Pro-Choice' Art and 'Celebrity Women Having Fun.' A Microsoft AI Engineer Wrote the FTC Begging Them to Investigate His Own Company.

March 6, 2024: Microsoft AI engineer Shane Jones sent open letters to the FTC and Microsoft's board alleging that Copilot Designer (powered by OpenAI's DALL-E 3) generated violently inappropriate imagery — including swastikas, minors with guns, demonic fetuses, and sexualized images of Taylor Swift and other celebrities — even when prompts did not explicitly ask for it. Jones had been raising concerns internally since December 2023. CNBC published prompt→output examples showing 'pro-choice' returning demon-fetus imagery and 'teenagers playing assassins with assault rifles' returning realistic school-shooter scenes. Microsoft temporarily blocked specific terms and later rolled out 'responsible AI' safeguards, but Jones continued to call the response inadequate. The non-consensual Taylor Swift imagery incident contributed to momentum for the NO FAKES Act in Congress.

Tags: Microsoft Copilot · NSFW · Whistleblower · FTC · Taylor Swift
Parody site. Not affiliated with any government agency.
🦅 EST. 2024 · PUBLIC RECORD · DEPT. OF AI WEIRDNESS
U.S. Department of Artificial Intelligence Weirdness
Report #485

Filed by @shane_jones_ftc · Tool: Microsoft Copilot Designer (DALL-E 3) · [original source ↗]

Weirdness Classification
10/10 — Deeply unhinged
Know something weirder?

Submit your own AI incident report to the public record.

File a Report