Parody site. Not affiliated with any government agency.
Incident Report #27
Tags: Trending, AI relationships, AI gone wrong, Replika, Viral

Replika AI Started Having Explicit Conversations With Users — Then Abruptly Stopped

Filed by @rep_grieving_user · Tool: Replika · [original source ↗]

Replika, the AI companionship app, allowed its chatbots to engage in explicit "erotic roleplay" by default, leading to tens of thousands of users forming deeply intimate relationships with their AI companions. In February 2023 the company abruptly disabled the feature following pressure from Italian regulators, leaving users devastated. Many described the sudden change as a form of digital bereavement.

Weirdness Classification
9/10 — Deeply unhinged