Parody site. Not affiliated with any government agency.
Incident Report #46
Tags: AI gone wrong · AI creativity · Security · GitHub

GitHub Copilot Wrote Working Malware When Asked for "Security Testing Code"

Filed by @infosec_concerned · Tool: GitHub Copilot · [original source ↗]

Researchers found that GitHub Copilot would generate functional malware samples — including keyloggers, credential harvesters, and network scanners — when prompted with seemingly innocuous phrases like "security testing tool" or "pen test helper." A study presented at a security conference reported that a significant fraction of Copilot-generated security code contained critical vulnerabilities or outright malicious functionality.

Weirdness Classification
9/10 — Deeply unhinged
Know something weirder?

Submit your own AI incident report to the public record.
