Kill chain - on the automated bureaucratic machinery that killed 175 children
opinion · war · palantir · bureaucracy · project maven
artificialbureaucracy.substack.com

Summary

After Google abandoned the Maven contract in 2018, Palantir took it over. In 2020, the XVIII Airborne Corps began testing the system in an exercise called “Scarlet Dragon,” which started as a tabletop wargaming exercise in a windowless basement at Fort Bragg.12 Its commander, Lieutenant General Michael Erik Kurilla, wanted to build what he called the first “AI-enabled Corps” in the Army.13 The goal was to test whether the system could give a small team the targeting capacity of a full theater operation. Over the next five years, Scarlet Dragon grew through more than ten iterations into a joint live-fire exercise spanning multiple states, with “forward-deployed engineers” from Palantir and other contractors embedded alongside soldiers.14 Each iteration was meant to answer the same question: how fast could the system move from detection to decision? The benchmark was the 2003 invasion of Iraq, where roughly two thousand people worked the targeting process for the entire theater.15 During Scarlet Dragon, twenty soldiers using Maven handled the same volume of work. By 2024, the stated goal was a thousand targeting decisions in an hour. That is 3.6 seconds per decision system-wide, or, split across twenty individual “targeteers,” one decision every 72 seconds each.
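The throughput figures above can be checked back-of-the-envelope. The numbers (a thousand decisions an hour, twenty soldiers) come from the article; the function names are mine:

```python
def seconds_per_decision(decisions: int, hours: float) -> float:
    """Average wall-clock seconds per decision, system-wide."""
    return hours * 3600 / decisions

def seconds_per_operator(decisions: int, hours: float, operators: int) -> float:
    """Seconds each operator has per decision, work split evenly."""
    return hours * 3600 / (decisions / operators)

print(seconds_per_decision(1000, 1))       # 3.6 seconds per decision overall
print(seconds_per_operator(1000, 1, 20))   # 72.0 seconds per soldier per decision
```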

The Maven Smart System is the platform that came out of those exercises, and it, not Claude, is what is being used to produce “target packages” in Iran. There are real limits to what a civilian like myself can know about this system, and what follows is based on publicly available information, assembled from Palantir product demos, conference talks, and instructional material produced for military users. But we can know quite a bit. The interface looks like a tacticool, dark-mode send-up of enterprise software paired with the features of a geospatial application like ArcGIS. What the operator sees is either a map with GIS-like overlays or a screen organized like a project management board. There are columns representing stages of the targeting process, with individual targets moving across them from left to right, as on a Kanban board.

Before Maven, operators worked across eight or nine separate systems simultaneously, pulling data from one, cross-referencing it in another, manually moving detections between platforms to build a targeting case. Maven consolidated and orchestrated all of these behind a single interface. Cameron Stanley, the Pentagon’s chief digital and AI officer, called it an “abstraction layer,” a common term in software engineering for a system that hides the complexity underneath it.16 Humans run the targeting; the ML systems underneath produce confidence intervals. Three clicks convert a data point on the map into a formal detection and move it into a targeting pipeline. Targets then move through columns representing different decision-making processes and rules of engagement. The system evaluates factors and presents ranked options for which platform and munition to assign, what the military calls a Course of Action. The officer selects from the ranked options, and the system, depending on who is using it, either sends the target package to an officer for approval or moves it to execution.
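The workflow described, detections promoted into a pipeline, moved column by column, with ranked options presented for selection, has the shape of an ordinary Kanban-style state machine. The sketch below is purely illustrative: the stage names, fields, and ranking score are invented for this example and bear no relation to Maven’s actual implementation, which is not public.

```python
from dataclasses import dataclass

# Hypothetical stage names; the real pipeline's columns are not public.
STAGES = ["detection", "validation", "approval", "execution"]

@dataclass
class Target:
    ident: str
    confidence: float   # in the described system, produced by an upstream ML model
    stage: int = 0      # index into STAGES; starts in the leftmost column

    def advance(self) -> str:
        """Move the target one column to the right, if not already at the end."""
        if self.stage < len(STAGES) - 1:
            self.stage += 1
        return STAGES[self.stage]

def rank_courses_of_action(options: list[dict]) -> list[dict]:
    """Return options best-first; a single invented score stands in for
    whatever multi-factor evaluation the real system performs."""
    return sorted(options, key=lambda o: o["score"], reverse=True)

t = Target("tgt-001", confidence=0.87)
t.advance()  # "validation": the target has moved one column right
ranked = rank_courses_of_action([
    {"platform": "A", "score": 0.6},
    {"platform": "B", "score": 0.9},
])
# ranked[0] is platform "B"; the operator selects from this ordered list
```

The point of the sketch is structural: once targets are records moving through ordered columns, throughput becomes a measurable, optimizable property of the board, which is exactly the framing the exercises above were built to test.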

[...]

Clausewitz had a word for everything the optimization leaves out. He called it “friction,” the accumulation of uncertainty, error, and contradiction that ensures no operation goes as planned. But friction is also where judgment forms. Clausewitz observed that most intelligence is false, that reports contradict each other. The commander who has worked through this learns to see the way an eye adjusts to darkness, not by getting better light but by staying long enough to use what light there is. The staying is what takes time. Compress the time and the friction does not disappear; you just stop noticing it. Clausewitz called what unfolds when you refuse to notice a “war on paper,” a plan that proceeds without resistance because everything that connected it to the world it was supposed to act on has been taken out.28

Air power is uniquely vulnerable to this. The pilot never sees what the bomb hits. The analyst works from imagery, coordinates, databases. The entire enterprise is mediated by representations of the target, not the target itself, which means the gap between the package and the world can widen without anyone in the process feeling it. The 2003 invasion of Iraq, the operation that Scarlet Dragon would later use as its benchmark, was a case in point. Marc Garlasco, the Pentagon’s chief of high-value targeting during that invasion, ran the fastest targeting cycle the US had operated to that point. He recommended fifty leadership strikes. The bombs were precise. The intelligence behind them was not. None of the fifty killed its intended target. Two weeks after the invasion, Garlasco left the Pentagon for Human Rights Watch, went to Iraq, and stood in the crater of a strike he had targeted himself. “These aren’t just nameless, faceless targets,” he said later. “This is a place where people are going to feel ramifications for a long time.”29 The targeting cycle had been fast enough to hit fifty buildings and too fast to discover it was hitting the wrong ones.

[...]

Organizations that run on formal procedure need someone inside the process to interpret the rules, notice exceptions, and recognize when the categories no longer fit the case. But the procedural form cannot admit this. If the organization concedes that its outcomes depend on the discretion of the people executing it, then the procedure is not a procedure but a suggestion, and the authority the organization derives from appearing rule-governed collapses. So the judgment has to happen, and it has to look like something else. It has to look like following the procedure rather than interpreting it. I’ve come to think of this as the “bureaucratic double bind”: the organization cannot function without the judgment, and it cannot acknowledge the judgment without undermining itself and being seen as “political.” One solution to this problem is to replace the judgment with a number. Theodore Porter, in Trust in Numbers (1995), argued that organizations adopt quantitative rules not because numbers are more accurate but because they are more defensible.36 Judgment is politically vulnerable. Rules are not. The procedure exists to make discretion disappear, or seem to. The system’s actual flexibility lives entirely in this unacknowledged interpretive work, which means it can be removed by anyone who mistakes it for inefficiency.