Accelerating the ‘kill chain’ – a terrifying glimpse of future warfare

A misfit band of military personnel and Silicon Valley uber-geeks apply AI to target America’s enemies more rapidly and accurately than ever before

Jonathan Boff
Disastrous errors will always be made: the site of the girls’ school in southern Iran, bombed in February (Ali Najafi/ISNA/AFP via Getty Images)
issue 09 May 2026

America possesses the most powerful military in history, but since 1945 it has not won a war against anyone other than Saddam Hussein. It appears not to understand why. In fact, the only thing the US seems worse at than winning wars is learning lessons from its defeats. People such as the secretary of war, Pete Hegseth, think it’s all about woke. Lily-livered longhairs stateside stabbed the army in the back over Vietnam; then ‘stupid rules of engagement’ tied the military’s hands in Iraq and Afghanistan and caused the disasters there. The solution is to fight harder, if necessary even at the expense of ethics and the law.

Another answer might be to get US forces fighting smarter. In this lively and at times terrifying book, Katrina Manson tells the story of a team of hard-charging military personnel and civilian contractors who tried to do that. Project Maven applied artificial intelligence to target America’s enemies more rapidly and accurately than ever before. Where previously dozens of square-eyed analysts might spend days manually collating intelligence to identify and strike the gunmen, bombers and planners of al-Qaeda or the Taliban, Maven deployed AI to analyse the data, automating and accelerating the ‘kill chain’. Within a few years of the 2017 launch – warp speed by Pentagon or MoD innovation standards – possible targets were popping up on a screen within seconds. One mouse click was all it took to OK a strike. In Ukraine, a Russian missile launcher was reduced to smoking scrap just 18 minutes after being spotted.

Project Maven offers a fascinating case study in military innovation. It is a tale of rugged American pioneers boldly going where no man has gone before. The maverick leader, Drew Cukor, a US Marine colonel, alienates and inspires in equal measure, often at the same time. That he’s a Marine is relevant because everywhere, from the halls of Montezuma to the shores of Tripoli, the US Marine Corps has been proudly ‘moving fast and breaking things’ since long before the tech bros came up with the mantra.

Under Cukor is a misfit band of serving officers and Silicon Valley uber-geeks. Half the latter don’t even want to work for the military, but the lure of a dumb and deep-pocketed client is too strong to resist. Together they dream the impossible dream and race against time and overwhelming odds. No one thinks Maven can work: vested interests obstruct them at every turn, and the only way to get things done is to break the rules. Yet Cukor and his motliest of crews win through. Maven took off and today AI-driven targeting systems are deeply embedded in military headquarters all over the world, including the UK.

The book thus offers us a glimpse of the rapidly changing face of warfare – or at least what it looked like a year or two ago. Manson does not ignore the fears that systems such as Maven provoke about AI running wild and launching robot wars that kill us all. She clearly worries that gung-ho types might one day tweak the technology to provide full automation, with no human pressing the final button, and that disastrous errors will always be made. The rubble of the Iranian girls’ school bombed on 28 February proves that.

There are grounds for comfort, however. For now at least, humans do remain essential. They design, build and maintain the systems and they need to recalibrate the algorithms minute by minute. Also, AI may be changing less than the hype suggests. It is far from clear that war is getting faster, for instance. In Ukraine, drones and AI-driven targeting effectively prevent battlefield manoeuvres. The result is not blitzkrieg but stalemate.

Equally, the tempo of attacks on Iran over recent weeks, after an initial flurry which owed more to pre-planning than AI, has been roughly the same as against Iraq in 2003. Traditional constraints, such as logistics and crew fatigue, apparently remain more important than processing power and data speeds. AI fills the headlines when, like Maven, it is about making big bangs and killing bad guys. But it is probably in less glamorous military spheres, such as planning and supplies management, that its impact will really be felt.

Technology such as Maven may be a good solution, but it’s answering the wrong question. America fails not because it doesn’t fight hard or smartly enough but because it fights the wrong fights in the first place. In Vietnam, Afghanistan, Iraq and now Iran, US leaders have tried to solve complex political, economic, military and cultural problems by force alone. Even the most hi-tech violence on the planet cannot meet such challenges. To succeed, the US must take the time to understand the difficulties it faces and devise a proper strategy which makes the most of its unique reserves of both hard and soft power. America has been a force for good in the past. We need it to be one again.