Autonomous, cheap, deadly: the Pentagon gears up for drone warfare 2.0

The US Department of Defense wants to meet its need for "affordable and rapidly scalable autonomous systems in the air" with a pilot project.


A rendering of a low-cost military drone from IS4S.

(Image: Integrated Solutions for Systems, Inc.)

This article was originally published in German and has been automatically translated.

In light of Russia's war of aggression against Ukraine, in which mass-produced unmanned aerial vehicles play an important role, and of growing tensions with China, the Pentagon is changing its drone warfare strategy. The MQ-1 Predator and MQ-9 Reaper stand for the earlier "war on terror" and its "targeted killings". However, they are considered "over-engineered" and "labor-intensive" to manufacture, according to the Defense Innovation Unit (DIU), the innovation arm of the US Department of Defense. For the next stage, the DIU therefore wants to harness the "power of commercial technologies" – that is, inexpensive, mass-produced off-the-shelf goods – "to meet a critical operational need for uncomplicated, affordable and rapidly scalable airborne autonomous systems".

To this end, the DIU, together with the US Air Force Armament Directorate, announced a competition in the fall to develop a prototype for a next-generation armed drone, dubbed the "Enterprise Test Vehicle" (ETV). The unmanned aircraft sought must be able to cover at least 500 nautical miles (926 kilometers) at a speed of at least 100 knots (185.2 km/h) and complete its first test flight no later than seven months after the contract is signed. It must also be able to deliver "a kinetic payload" – in particular a weapon such as a missile – and be launchable in mid-air, for example from the rear of a cargo aircraft. Also required is a system architecture that allows the rapid integration of commercially available components such as sensors or software-defined radios. Ultimately, the system is to be available "in large quantities".

At the beginning of June, the DIU and the Armament Directorate announced that Anduril Industries, Integrated Solutions for Systems, Leidos Dynetics and Zone 5 Technologies will each develop pilot solutions for flight demonstrations in late summer and fall. The providers were selected "from an extremely competitive field of more than 100 applicants". Other ETV project participants include the Air Force Research Laboratory, Special Operations Command, Naval Air Systems Command and Indo-Pacific Command. In August 2023, the US Department of Defense also unveiled "Replicator", another multi-billion-dollar initiative, presumably closely linked to the ETV, to deploy thousands of autonomous systems. This likewise involves low-cost machines potentially controlled with the help of artificial intelligence (AI), such as self-driving ships, robotic aircraft and swarms of smaller kamikaze drones.

Predator, Reaper & Co. are already networked in the background via a "kill cloud", which analyzes and selects targets with the help of big data. Ultimately, however, a human operator authorizes each strike. In the first twenty years of the war on terror, the USA carried out over 91,000 airstrikes in seven major conflict zones, killing up to 48,308 civilians – so-called collateral damage – according to the observers at Airwars. Advances in AI have now opened up the option of flying robots selecting their targets themselves. Particularly in demand are kamikaze drones, also known as loitering munitions: guided weapons that are launched without a specific target, circle over an area for an extended period, and then strike suddenly.

Experts fear that the mass production of inexpensive lethal drones will lead to even more civilian casualties. This is a "clear danger," Priyanka Motaparthy, director of Columbia Law School's Counterterrorism Project, told The Intercept; the military must explain how it assesses these risks. Apparently, no impact assessments exist yet. Pentagon regulations published in 2023 merely state that fully and semi-autonomous weapon systems must be used "in accordance with the law of war" and the "Department of Defense's ethical principles for AI". Accordingly, personnel are to exercise an "appropriate" degree of "judgment and care" when developing and deploying AI. Pope Francis, meanwhile, has just called for a ban on lethal autonomous weapons.

(mki)