In 2024, reports emerged that Israel's military was using AI systems to generate lists of potential bombing targets in Gaza at unprecedented scale; separately, Palantir announced a strategic partnership with Israel's Ministry of Defense. An Israeli AI system called 'Lavender' reportedly flagged some 37,000 Palestinians as suspected militants, while another system, 'The Gospel', reportedly generated targets far faster than human analysts could, accelerating the pace of airstrikes. The revelations sparked protests at tech companies around the world and reignited debates about autonomous weapons and the use of AI in warfare.