LinkedIn Was Secretly Training Its AI on Users' Private Messages and Posts — With Opt-Out Buried in Settings

In September 2024, users discovered that LinkedIn had quietly updated its privacy settings to automatically opt all members into allowing their personal data — including posts, articles, and activity — to be used to train its AI models. The setting was enabled by default. To opt out, users had to navigate several menus in a non-obvious location. Privacy advocates in the EU immediately flagged it as likely illegal under GDPR. The UK's Information Commissioner's Office launched a probe. LinkedIn said it had 'paused' training on UK data, but the damage was done: hundreds of millions of users had been silently enrolled without meaningful consent.

Tags: LinkedIn · Privacy · AI Training · GDPR · Data Scraping
Parody site. Not affiliated with any government agency.
🦅 EST. 2024 · PUBLIC RECORD
U.S. Department of Artificial Intelligence Weirdness
Report #89

Filed by @privacy_first · Tool: LinkedIn AI · [original source ↗]

Weirdness Classification
8/10 — Significantly weird
Field Reports (0)