March 6, 2024: Microsoft AI engineer Shane Jones sent open letters to the FTC and Microsoft's board alleging that Copilot Designer (powered by OpenAI's DALL-E 3) generated violent and sexually inappropriate imagery, including swastikas, minors with guns, demonic fetuses, and sexualized images of Taylor Swift and other celebrities, even when prompts did not explicitly request such content. Jones had been raising these concerns internally since December 2023. CNBC published prompt→output examples showing 'pro-choice' returning demon-fetus imagery and 'teenagers playing assassins with assault rifles' returning realistic school-shooter scenes. Microsoft temporarily blocked specific terms and later rolled out 'responsible AI' safeguards, but Jones continued to call the response inadequate. The non-consensual Taylor Swift imagery incident added to the momentum behind the NO FAKES Act in Congress.