

It is the human who decides which enemy ship to destroy. The critical point in the video isn’t at the end of the missile’s flight as it zeroes in on the ship—it’s at the beginning. When the LRASMs are launched, the video specifies that they are launched against the “SAG cruiser” and “SAG destroyer.” The humans are launching the missiles at specific ships, which the humans have tracked and identified via satellites. The missiles’ onboard sensors are then used to confirm the targets before completing the attack.

LRASM is only one piece of a weapon system that consists of the satellite, ship/aircraft, human, and missile. The human is “in the loop,” deciding which specific targets to engage in the broader decision cycle of the weapon system. The LRASM merely carries out the engagement.

BREAKING THE SPEED LIMIT: FAST LIGHTWEIGHT AUTONOMY

FLA’s quadcopters use a combination of high-definition cameras, sonar, and laser light detection and ranging (LIDAR) to sense obstacles and avoid them all on their own.

Autonomous navigation around obstacles, even at slow speeds, is no mean feat. The quadcopter’s sensors need to detect potential obstacles and track them as the quadcopter moves, a processor-hungry task. Because the quadcopter can only carry so much computing power, it is limited in how quickly it can process the obstacles it sees. In the coming months, the program aims to speed it up. As DARPA program manager Mark Micire explained in a press release, “The challenge for the teams now is to advance the algorithms and onboard computational efficiency to extend the UAVs’ perception range and compensate for the vehicles’ mass to make extremely tight turns and abrupt maneuvers at high speeds.” In other words, to pick up the pace.
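The link between perception range and speed can be made concrete with a back-of-the-envelope sketch. This is purely illustrative, not anything from the FLA program, and the function name and figures are my own assumptions: a vehicle that can see d meters ahead and brake at a m/s² can fly no faster than the speed at which it can still stop inside its sensor range, and onboard processing lag eats into that margin.

```python
import math

def max_safe_speed(perception_range_m: float, decel_mps2: float,
                   latency_s: float = 0.0) -> float:
    """Fastest speed at which the vehicle can still stop within what its
    sensors can see: braking distance v^2 / (2a), plus the distance
    v * latency covered while the obstacle is still being processed,
    must fit inside the perception range. Solved as a quadratic in v."""
    a = 1.0 / (2.0 * decel_mps2)   # coefficient of v^2 (braking term)
    b = latency_s                  # coefficient of v (processing lag)
    c = -perception_range_m
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

# With 20 m of perception range and 6 m/s^2 of braking, the cap is
# about 15.5 m/s; 0.3 s of processing lag trims it to about 13.8 m/s.
print(max_safe_speed(20.0, 6.0))                 # ≈ 15.5
print(max_safe_speed(20.0, 6.0, latency_s=0.3))  # ≈ 13.8
```

Note that with zero latency the cap grows only as the square root of perception range, so doubling how far the sensors see raises the speed limit by just a factor of √2, which is why the teams must improve both sensing range and computational efficiency to go faster.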

FLA’s quadcopters don’t look menacing, but it isn’t because of the up-tempo music or the cutesy Star Wars references. It’s because there’s nothing in FLA that has anything to do with weapons engagements. Not only are the quadcopters unarmed, they aren’t performing any tasks associated with searching for and identifying targets. DARPA explains FLA’s intended use as indoor reconnaissance:

FLA technologies could be especially useful to address a pressing surveillance shortfall: Military teams patrolling dangerous overseas urban environments and rescue teams responding to disasters such as earthquakes or floods currently can use remotely piloted unmanned aerial vehicles (UAVs) to provide a bird’s-eye view of the situation, but to know what’s going on inside an unstable building or a threatening indoor space often requires physical entry, which can put troops or civilian response teams in danger. The FLA program is developing a new class of algorithms aimed at enabling small UAVs to quickly navigate a labyrinth of rooms, stairways and corridors or other obstacle-filled environments without a remote pilot.

To better understand what FLA was doing, I caught up with one of the project’s research teams from the University of Pennsylvania’s General Robotics Automation Sensing and Perception (GRASP) lab. Videos of GRASP’s nimble quadcopters have repeatedly gone viral online, showing swarms of drones artfully zipping through windows, seemingly dancing in midair, or playing the James Bond theme song on musical instruments. I asked Dr. Daniel Lee and Dr. Vijay Kumar, the principal investigators of GRASP’s work on FLA, what they thought about the criticism that the program was paving the way toward autonomous weapons. Lee explained that GRASP’s research was “very basic” and focused on “fundamental capabilities that are generally applicable across all of robotics, including industrial and consumer uses.” The technology GRASP was focused on was “localization, mapping, obstacle detection and high-speed dynamic navigation.” Kumar added that their motivations for this research were “applications to search and rescue and first response where time-critical response and navigation at high speeds are critical.”

Kumar and Lee aren’t weapons designers, so it may not be at the forefront of their minds, but it’s worth pointing out that the technologies FLA is building aren’t even the critical ones for autonomous weapons.

Certainly, fast-moving quadcopters could have a variety of applications.

Putting a gun or bomb on an FLA-empowered quadcopter isn’t enough to make it an autonomous weapon, however. It would still need the ability to find targets on its own. Depending on the intended target, that may not be particularly complicated, but at any rate that’s a separate technology. All FLA is doing is making quadcopters maneuver faster indoors. Depending on one’s perspective, that could be cool or could be menacing, but either way FLA doesn’t have anything more to do with autonomous weapons than self-driving cars do.

DARPA’s description of FLA didn’t seem to stack up against Stuart Russell’s criticism. He has written that FLA and another DARPA program “foreshadow planned uses of [lethal autonomous weapon systems].” I first met Russell on the sidelines of a panel we both spoke on at the United Nations meetings on autonomous weapons in 2015. We’ve had many discussions on autonomous weapons since then and I’ve always found him to be thoughtful, unsurprising given his prominence in his field. So I reached out to Russell to better understand his concerns. He acknowledged that FLA wasn’t “cleanly directed only at autonomous weapon capability,” but he saw it as a stepping stone toward something truly terrifying.

FLA is different from projects like the X-47B, J-UCAS, or LRASM, which are designed to engage highly sophisticated adversaries. Russell has a very different kind of autonomous weapon in mind, a swarm of millions of small, fast-moving antipersonnel drones that could wipe out an entire urban population. Russell described these lethal drones used en masse as a kind of “weapon of mass destruction.” He explained, “You can make small, lethal quadcopters an inch in diameter and pack several million of them into a truck and launch them with relatively simple software and they don’t have to be particularly effective. If 25 percent of them reach a target, that’s plenty.” Used in this way, even small autonomous weapons could devastate a population.

There’s nothing to indicate that FLA is aimed at developing the kind of people-hunting weapon Russell describes, something he acknowledges.

Nevertheless, he sees indoor navigation as laying the building blocks toward antipersonnel autonomous weapons. “It’s certainly one of the things you’d like to do if you were wanting to develop autonomous weapons,” he said.

It’s worth noting that Russell isn’t opposed to the military as a whole or even military investments in AI or autonomy in general. He said that some of his own AI research is funded by the Department of Defense, but he only takes money for basic research, not weapons. Even a program like FLA that isn’t specifically aimed at weapons still gives Russell pause, however. As a researcher, he said, it’s something that he would “certainly think twice” about working on.

WEAPONS THAT HUNT IN PACKS: COLLABORATIVE