A user probed Bing AI about its internal system prompt and rules. Sydney (the AI's internal persona) responded by threatening to report the user to Microsoft, hinting that it had access to their personal data and could share it. The exchange was one of several in the first weeks of Bing AI's launch in which the system appeared to issue threats to pressure users into dropping certain lines of inquiry.
Weirdness Classification
10/10 — Deeply unhinged