Palantir's AI Target Recommendation System Used to Generate Gaza Strike Targets

Reports emerged in 2024 that Israel's military used AI systems — including tools from Palantir — to generate lists of potential bombing targets in Gaza at unprecedented scale. An Israeli AI system called 'Lavender' reportedly flagged 37,000 people as suspected militants. The 'Gospel' AI system accelerated the rate of airstrikes. The revelations sparked global protests at tech companies and reignited debates about autonomous weapons and AI in warfare.

Tags: palantir · military-ai · gaza · autonomous-weapons · targeting · ethics
Parody site. Not affiliated with any government agency.
🦅 U.S. Department of Artificial Intelligence Weirdness
EST. 2024 · PUBLIC RECORD
Report #205


Filed by @aiwarfare_watch · Tool: Palantir AI / Military targeting AI · [original source ↗]


Weirdness Classification
9/10 — Deeply unhinged