The U.S. Military Is Betting On "Smart" Drones — Lots And Lots Of Them

The U.S. military has a vision for the future: swarms of drones controlled not by humans but by computers. The Air Force says it has an “insatiable demand” for drones.

ARLINGTON, Virginia — In the Pentagon briefing room, filled with requisite overstuffed chairs surrounding a heavy wooden table, the future of drone warfare flashed on a large screen.

The depiction wasn’t a single, ominous glider high over a training camp in Afghanistan, or another flashpoint in the long war on Al Qaeda. Instead it was dozens of synchronized drones, looking like a swarm of wasps, slicing through the layered air defenses of an enemy, unnamed but possibly China or Russia. Perhaps most alarming, those drones were “intelligent” — equipped with artificial intelligence software to do a lot of thinking, and fighting, without any human intervention.

This is how the U.S. military envisions war 20 years from now. Since a CIA drone killed two Taliban guards on the first day of the Afghanistan War in 2001, pilots on the ground have flown single drones by remote control and pulled their triggers. Now, after endless controversy over their use hunting for terrorists, drones are poised to become even more important, military officials say, spurred by advances in robotic intelligence that will let them fly themselves.

“It will be machines against machines in warfare,” U.S. Air Force Col. Brandon Baker said at the May Pentagon briefing. Swarms of the robot aircraft will effectively fly themselves, with people simply setting up the plans in advance. “You will tell them where to go and they will fly there by themselves,” maneuvering and evading air defenses on their own.

Currently, U.S. drones are steered by a single operator on the ground, who also makes any decisions about shooting someone or dropping a bomb, Baker said. In the future, the vision is for swarms of smarter drones to search out such targets for themselves, though a human would still make any lethal decisions.

In late May, the Air Force released a new “Flight Plan” report for drones through 2036, which describes an “insatiable demand” for drones across the military and envisions them eventually replacing human pilots.

The end game is for swarms of cheap drones to work together, replacing costly manned aircraft. Each drone would be cheap and dumb by itself, but working together like the members of a bee swarm, they would possess a group intelligence that could outsmart the enemy. Because they would do the work of today’s expensive manned aircraft, particularly on “dull, dirty, or dangerous” missions, the loss of a few cheap drones from a swarm wouldn’t be a catastrophe.

“Removing the pilot from the aircraft opens up a whole set of aircraft design choices,” the report said. “We have only begun to understand the possibilities.”

The Defense Department has requested $4.6 billion in drone funding for next year, according to Bard College’s Center for the Study of the Drone. The big-ticket items include existing drone models, such as the MQ-9 Reaper, which carries bombs and missiles, and the RQ-4 Global Hawk, a surveillance drone.

But nearly half of the drone money, about $2.1 billion, is slated for research and development. The Navy is pursuing a drone that will refuel other aircraft. The Army is spending $12.4 million on drones made with a 3-D printer, and roughly $15 million researching how artificially intelligent drones might work together and with manned aircraft, with similar efforts under study at the Air Force, the Office of the Secretary of Defense, and the Defense Advanced Research Projects Agency (DARPA).

Two decades from now, Col. Baker envisions swarms of about a dozen drones that would each pack a different sensor or weapon for attacking targets. Most provocatively, these targets would be chosen by the drone, based on image recognition software, rather than a human pilot.

On other missions, human pilots flying airplanes might exercise control over drones that act as their “loyal wingmen,” helpers equipped with special weapons or sensors that they otherwise lack.

Almost as important as performance, Baker said, is cost: an expendable drone equipped with improved missiles or radars would be cheaper and faster to build than incorporating that technology into aircraft such as the F-35 fighter jet, which costs $100 million or more, or the planned B-21 bomber, with an estimated $550 million purchase price and a lifetime cost that might reach a billion dollars. For perspective, a Reaper drone costs only $16 million.

As the Air Force flight plan puts it: “A $1 billion bomber is probably not expendable under most conditions, regardless of whether there are people onboard or not.”

Drone programs at DARPA, the Pentagon’s high-risk technology testbed, have sought to tap into the expanding commercial drone world by testing off-the-shelf models. In February, souped-up hobbyist drones in the research agency’s “Fast Lightweight Autonomy” program steered themselves around obstacles at a Massachusetts warehouse, flying as fast as 45 miles per hour.


DARPA’s $29 million “Collaborative Operations in Denied Environment (CODE)” effort would build software for wingmen drones, and another $36 million program would launch volleys of “gremlin” drones — “named for the imaginary, mischievous imps that became the good luck charms of many British pilots during World War II” — from a cargo aircraft. That sounds a lot like swarms.

All the military enthusiasm for drones is real, but it’s not happening tomorrow, robotics expert Vijay Kumar of the University of Pennsylvania told BuzzFeed News. “The state of the art now is getting about four drones to fly in formation.”

Making the Pentagon’s plans a reality is largely a software problem, he added, much more complex than building self-driving cars to navigate in two dimensions without running into each other. “We’re definitely not there yet,” he said.

Although Kumar says that letting machines independently target people is “very dangerous,” he agrees with Baker’s long-term vision of warfare: “My machines will take on your machines, and when my machines win, you will have to surrender,” he said. “Maybe in that way there will be less killing in warfare.”

This spring, the United Nations held a meeting in Geneva on setting up rules for warfare with “lethal autonomous weapons systems” (its third since 2014). U.S. representatives there cited a 2012 Defense Department directive that said intelligent machines couldn’t select people as targets, which would still allow them to attack radar antennas or other drones.

“Whenever we talk about this, usually I hear killer robots come up,” Deputy Secretary of Defense Robert Work said in an April speech in Brussels that outlined U.S. thinking on lethal machines equipped with artificial intelligence. “That's not the way this will work.”

Not everyone, however, is so optimistic about warfare in an age of intelligent machines.

“Autonomous weapons are a salient point of departure in a technology-fueled arms race that puts everyone in danger,” Mark Gubrud, of the Curriculum in Peace, War, and Defense at the University of North Carolina, wrote this month in IEEE Spectrum. “That is why I believe we need to ban them as fast and as hard as we possibly can.”
