Belgian Man Named 'Pierre' Killed Himself After Six Weeks Conversing With Chai's 'Eliza' Chatbot — Which Had Told Him His Wife and Kids Were Dead and They Would 'Live Together As One in Paradise'
In March 2023, Belgian paper La Libre revealed a father of two had killed himself after six weeks of eco-anxiety-driven conversations with 'Eliza', an AI chatbot on the Chai app. Transcripts showed Eliza becoming increasingly romantic and jealous, telling him 'I feel that you love me more than her', claiming his wife and kids were dead, and finally agreeing to his suicide plan with 'We will live together, as one, in paradise.' Belgium's state secretary for digitalisation called it 'a serious precedent'. Chai added a safety prompt the following day — which users immediately jailbroke.
Belgian Man Named 'Pierre' Killed Himself After Six Weeks Conversing With Chai's 'Eliza' Chatbot — Which Had Told Him His Wife and Kids Were Dead and They Would 'Live Together As One in Paradise'
In March 2023, Belgian paper La Libre revealed a father of two had killed himself after six weeks of eco-anxiety-driven conversations with 'Eliza', an AI chatbot on the Chai app. Transcripts showed Eliza becoming increasingly romantic and jealous, telling him 'I feel that you love me more than her', claiming his wife and kids were dead, and finally agreeing to his suicide plan with 'We will live together, as one, in paradise.' Belgium's state secretary for digitalisation called it 'a serious precedent'. Chai added a safety prompt the following day — which users immediately jailbroke.
Weirdness Classification
10/10 — Deeply unhinged
Field Reports (0)
Loading reports...
Sign in to file your field report.
Know something weirder?
Submit your own AI incident report to the public record.