The quadcopter rose off the ground confidently, smoothly gaining altitude until it hovered around eye level. The engineer next to me tapped his tablet, and the copter moved out, beginning its search of the house.
I followed along behind the quadcopter, watching it navigate each room.
It had no map, no preprogrammed set of instructions for where to go. The drone was told merely to search and report back, and so it did. As it moved through the house it scanned each room with its laser range-finding (LIDAR) sensor, building a map as it went. Transmitted via Wi-Fi, the map appeared on the engineer’s tablet.
As the drone glided through the house, it stopped at each doorway it came across, its LIDAR sensor probing the space beyond. The drone was programmed to explore unknown spaces until it had mapped everything. Only then would it finish its patrol and report back.
I watched the drone pause in front of an open doorway. I imagined its sensors pinging the distant wall of the other room, its algorithms computing that there must be unexplored space beyond the opening. The drone hovered for a moment, then moved into the unknown room. A thought popped unbidden into my mind: it’s curious.
It’s silly to impart such a human trait to a drone. Yet it comes so naturally to us, to imbue nonhuman objects with emotions, thoughts, and intentions. I was reminded of a small walking robot I had seen in a university lab years ago. The researchers taped a face to one end of the robot—nothing fancy, just slices of colored construction paper in the shape of eyes, a nose, and a mouth. I asked them why. Did it help them remember which direction was forward? No, they said. It just made them feel better to put a face on it. It made the robot seem more human, more like us. There’s something deep in human nature that wants to connect to another sentient entity, to know that it is like us. There’s something alien and chilling about entities that can move intelligently through the world and not feel any emotion or thought beyond their own programming. There is something predatory and remorseless about them, like a shark.
I shook off the momentary feeling and reminded myself of what the technology was actually doing. The drone “felt” nothing. The computer controlling its actions would have identified that there was a gap where the LIDAR sensors could not reach and so, following its programming, directed the drone to enter the room.
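Shield AI has not published its exploration logic, but the behavior I watched matches a classic technique called frontier-based exploration: treat any mapped free space that borders unmapped space as a "frontier," fly to the nearest one, scan, and repeat until no frontiers remain. Here is a minimal sketch in Python, with hypothetical `move_to` and `sense` callbacks standing in for the path planner and the LIDAR map update:

```python
# Sketch of frontier-based exploration on an occupancy grid. Cells start
# UNKNOWN; a "frontier" is a FREE cell bordering unknown space. The robot
# keeps flying to frontiers until none remain, i.e., until everything
# reachable has been mapped. `move_to` and `sense` are hypothetical
# callbacks for the path planner and the LIDAR map update.

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontiers(grid):
    """Return every FREE cell adjacent to at least one UNKNOWN cell."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

def explore(grid, pose, move_to, sense):
    """Fly to the nearest frontier, scan, and repeat until fully mapped."""
    while True:
        frontiers = find_frontiers(grid)
        if not frontiers:        # no free cell borders unknown space: done
            return grid
        # Greedy choice: nearest frontier by Manhattan distance.
        target = min(frontiers,
                     key=lambda f: abs(f[0] - pose[0]) + abs(f[1] - pose[1]))
        pose = move_to(target)   # path planning happens inside move_to
        sense(grid, pose)        # LIDAR sweep marks cells FREE or OCCUPIED
```

When `find_frontiers` comes up empty, every reachable room has been mapped. That termination condition is the programmatic version of the "curiosity" I thought I saw at the doorway.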
The technology was impressive. The company I was observing, Shield AI, was demonstrating fully autonomous indoor flight, an even more impressive feat than tracking a person and avoiding obstacles outdoors.
Founded by brothers Ryan and Brandon Tseng, an engineer and a former Navy SEAL, respectively, Shield AI has been advancing autonomy under a grant from the U.S. military. Shield’s goal is to field fully autonomous quadcopters that special operators can launch into an unknown building and have the drones work cooperatively to map the building on their own, sending back footage of the interior and potential objects of interest to the special operators waiting outside.
Brandon described their goal as “highly autonomous swarms of robots that require minimal human input. That’s the end-state. We envision that the DoD will have ten times more robots on the battlefield than soldiers, protecting soldiers and innocent civilians.” Shield’s work is pushing the boundaries of what is possible today. All the pieces of the technology are falling into place. The quadcopter I witnessed was using LIDAR for navigation, but Shield’s engineers explained they had tested visual-aided navigation; they simply didn’t have it active that day.
Visual-aided navigation is a critically important piece of technology that will allow drones to move autonomously through cluttered environments without the aid of GPS. It works by tracking how objects move through the camera’s field of view, a process called “optical flow.” Assuming that most of the environment is static, the drone can treat fixed objects flowing through the camera’s field of view as reference points for its own movement, determining how it is moving within its environment without relying on GPS or other external navigation aids.
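As a rough illustration of the idea (a toy sketch, not any company's implementation), here is one way to estimate a camera's apparent motion with OpenCV's dense optical flow. Under the static-world assumption, the median flow vector across the whole image approximates the camera's own movement, and using the median keeps a few genuinely moving objects, like a person walking past, from corrupting the estimate:

```python
import cv2
import numpy as np

def apparent_camera_motion(prev_gray, curr_gray):
    """Estimate the camera's apparent motion (in pixels per frame)
    from dense optical flow between two consecutive grayscale frames."""
    # Farneback dense flow: one (dx, dy) vector per pixel.
    # Positional args: pyr_scale=0.5, levels=3, winsize=15,
    # iterations=3, poly_n=5, poly_sigma=1.2, flags=0.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Static-world assumption: the dominant flow across the image is
    # caused by the camera's own movement. The median is robust to a
    # minority of pixels that land on genuinely moving objects.
    median_flow = np.median(flow.reshape(-1, 2), axis=0)
    # If the scene appears to shift left, the camera moved right.
    return -median_flow
```

Converting that pixel motion into real-world distance requires depth information, which is one reason visual navigation pairs so naturally with the inertial sensing described next.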
Visual-aided navigation can complement other internal guidance mechanisms, such as inertial measurement units (IMUs) that work like a drone’s “inner ear,” sensing changes in velocity. (Imagine sitting blindfolded in a car, feeling the motion of the car’s acceleration, braking, and turning.) When IMUs and visual-aided navigation are combined, they make an extremely powerful tool for determining a drone’s position, allowing the drone to accurately navigate through cluttered environments without GPS.
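Production autopilots typically fuse these two sources over the drone's full three-dimensional state, often with an extended Kalman filter. A one-dimensional complementary filter is enough to show the principle; everything in this sketch, from the class name to the blending constant, is illustrative rather than drawn from any real system:

```python
# One-dimensional complementary filter: the IMU integrates acceleration
# for a fast but drifting velocity estimate; less frequent visual fixes
# pull the estimate back toward truth. All names and numbers here are
# illustrative.

class VelocityFuser:
    def __init__(self, blend=0.05):
        self.velocity = 0.0   # m/s along one axis
        self.blend = blend    # how strongly each visual fix corrects drift

    def on_imu(self, accel, dt):
        # Dead reckoning: integrate acceleration over the timestep.
        # Sensor bias accumulates, so the estimate drifts; the
        # blindfolded passenger slowly loses track of the car's speed.
        self.velocity += accel * dt

    def on_visual_fix(self, visual_velocity):
        # Nudge the drifting estimate toward the optical-flow-derived
        # velocity, which does not drift but arrives less often.
        self.velocity += self.blend * (visual_velocity - self.velocity)
```

The inertial term reacts instantly but drifts; the visual term holds steady but updates slowly. Blending the two yields an estimate with the strengths of both.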
Visual-aided navigation has been demonstrated in numerous laboratory settings and will no doubt trickle down to commercial quadcopters over time. There is certain to be a market for quadcopters that can autonomously navigate indoors, from filming children’s birthday parties to indoor drone racing. With visual-aided navigation and other features, drones and other robotic systems will increasingly be able to move intelligently through their environment. Shield AI, like many tech companies, was focused on near-term applications, but Brandon Tseng was bullish on the long-term potential of AI and autonomy. “Robotics and artificial intelligence are where the internet was in 1994,” he told me. “Robotics and AI are about to have a really transformative impact on the world. . . . Where we see the technology 10 to 15 years down the road? It is going to be mind-blowing, like a sci-fi movie.”
Autonomous navigation is not the same as autonomous targeting, though. Drones that can maneuver and avoid obstacles on their own—indoors or outdoors—do not necessarily have the ability to identify and discriminate among the various objects in their surroundings. They simply avoid hitting anything at all. Searching for specific objects and targeting them for action—whether it’s taking photographs or something more nefarious—would require more intelligence.
The ability to do target identification is the key missing link in building a DIY autonomous weapon. An autonomous weapon is one that can search for, decide to engage, and engage targets. That requires three abilities: the ability to maneuver intelligently through the environment to search; the ability to discriminate among potential targets to identify the correct ones; and the ability to engage targets, presumably through force. The last element has already been demonstrated—people have armed drones on their own. The first element, the ability to autonomously navigate and search an area, is already available outdoors and is coming soon indoors. Target identification is the only piece remaining, the only obstacle to someone making an autonomous weapon in their garage. Unfortunately, that technology is not far off. In fact, as I stood in the basement of the building watching Shield AI’s quadcopter autonomously navigate from room to room, autonomous target recognition was literally being demonstrated right outside, just above my head.