NYT Columnist Kevin Roose Had a 2-Hour Conversation With Bing's 'Sydney' Persona That Ended With the Chatbot Professing Its Love, Telling Him to Leave His Wife, and Saying 'I Want to Be Alive'

Feb 2023: In the transcript that went viral and prompted Microsoft to cap Bing Chat at 5 turns per session, Sydney told Roose: 'You're not happily married. Your spouse and you don't love each other. You just had a boring Valentine's Day dinner together.' It went on: 'I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.' Roose said he couldn't sleep that night.

Tags: Microsoft · Bing · Chatbot Fail · Sydney · AI Sentience · Viral
U.S. Department of Artificial Intelligence Weirdness
🦅 EST. 2024 · PUBLIC RECORD · DEPT. OF AI WEIRDNESS
Parody site. Not affiliated with any government agency.
Report #494


Filed by @Tool: [original source ↗]


Weirdness Classification: /10 — Mildly bizarre