Inside Unmanned Systems

APR-MAY 2018


Issue link: https://insideunmanned.epubxp.com/i/969777


…gin, the drones can easily crash. This suggests map-based approaches may prove impractical in real-world settings when the exact position of every obstruction is unpredictable or unknown.

"There have been many challenges along the way for researchers trying to push the limits of autonomous navigation in unknown environments," NanoMap lead researcher Peter Florence said. "Things are just really hard when you are in real, unknown, diverse environments, like out in a forest you've never seen before."

NanoMap, on the other hand, considers a drone's position to be uncertain, and models and accounts for that uncertainty. This approach can help drones more reliably fly at higher speeds in close quarters, Florence said.

Mapping techniques for drones often rely on so-called "occupancy grids," in which the many measurements that drones take are incorporated into a 3-D representation of the world. However, such measurements can prove both unreliable and difficult to gather quickly, limiting high-speed maneuvering in cluttered spaces.

Instead of operating under the assumption that avoiding obstacles requires taking many different measurements and figuring out each object's exact location in space, NanoMap aims to gather enough information to know the general area of an object. It also searches its memory of what it has seen previously to anticipate how it might best move to places it currently does not see.

The MIT researchers tested NanoMap as part of the Defense Advanced Research Projects Agency's Fast Lightweight Autonomy program. Their experiments involved a DJI Flame Wheel quad-rotor drone, a dual-core Intel NUC i7 computer processor, an Intel RealSense R200 depth camera sensor for outdoor environments, an ASUS Xtion depth camera sensor for indoor environments, and a Point Grey Flea3 camera and ADIS 16448 inertial measurement unit to help the drone keep track of its position and orientation.

Using NanoMap, the drone could fly at speeds of up to 10 meters per second in forested canopy environments and 8 meters per second in indoor warehouse environments. "Working on robots racing through forests is just intrinsically super fun to me," Florence said.

These experiments revealed how much accounting for uncertainty helped the drone fly. For example, if NanoMap did not model for uncertainty and the drone drifted just 5 percent away from where it was expected to be, it would crash in 28 percent of flights. Meanwhile, when the drone accounted for uncertainty, the crash rate dropped to 2 percent of these flights.

"The results of MIT's NanoMap are clearly impressive. Flying at that speed is definitely a challenge from the algorithmic and hardware point of view," said Antonio Loquercio, an artificial intelligence researcher at the University of Zurich in Switzerland, who did not take part in the research on NanoMap.

Drone speed could also increase given depth sensors with greater range and drones with better acceleration and deceleration capabilities. Future versions of NanoMap may …

BY THE NUMBERS
Up to 10 meters/second: the speed a NanoMap-equipped drone could fly in a forest
Up to 8 meters/second: the speed a NanoMap-equipped drone could fly in a warehouse

Photo courtesy Jonathan How, MIT
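For readers who want a concrete feel for the idea, the sketch below is a minimal, hypothetical Python illustration, not MIT's NanoMap code, of the approach the article describes: rather than fusing every measurement into one global occupancy grid, the drone keeps a short history of depth-sensor views, searches that memory for the most recent view that actually saw a queried region, and inflates the required obstacle clearance by the position uncertainty accumulated since that view was captured. All names here (LocalHistoryMap, SensorView, is_free) are assumptions for illustration, and rotation is ignored for brevity.

```python
# Illustrative sketch only: uncertainty-aware obstacle checks against a short
# history of depth-sensor views, in the spirit of NanoMap's described approach.
from dataclasses import dataclass
from collections import deque
import numpy as np


@dataclass
class SensorView:
    points_body: np.ndarray       # (N, 3) obstacle points in the frame where they were captured, meters
    rel_translation: np.ndarray   # (3,) displacement accumulated since capture (maps current frame -> capture frame)
    pose_sigma: float             # accumulated 1-sigma position uncertainty since capture, meters


class LocalHistoryMap:
    """Keep a FIFO of recent depth views and answer 'is this point safe to fly through?'"""

    def __init__(self, history_len: int = 10, robot_radius: float = 0.5):
        self.views: deque = deque(maxlen=history_len)   # newest view first
        self.robot_radius = robot_radius

    def add_view(self, points_body: np.ndarray) -> None:
        # A fresh measurement has zero offset and zero uncertainty relative to itself.
        self.views.appendleft(SensorView(points_body, np.zeros(3), 0.0))

    def propagate(self, motion: np.ndarray, step_sigma: float) -> None:
        # The drone moved by `motion` (body frame, no rotation assumed); push that
        # displacement and its drift uncertainty into every stored view.
        for v in self.views:
            v.rel_translation = v.rel_translation + motion
            v.pose_sigma = float(np.hypot(v.pose_sigma, step_sigma))

    @staticmethod
    def _in_fov(p: np.ndarray, half_angle: float = np.radians(35)) -> bool:
        # Crude field-of-view test: in front of the sensor, inside a cone about +x.
        rng = float(np.linalg.norm(p))
        return rng > 1e-6 and p[0] > 0 and float(np.arccos(np.clip(p[0] / rng, -1, 1))) < half_angle

    def is_free(self, query_body: np.ndarray) -> bool:
        # Use the most recent view that actually saw the queried region, and
        # inflate the clearance by that view's accumulated pose uncertainty.
        for v in self.views:
            q = query_body + v.rel_translation        # express query in the view's frame
            if not self._in_fov(q):
                continue                              # that view never saw this region
            clearance = self.robot_radius + 3.0 * v.pose_sigma
            dists = np.linalg.norm(v.points_body - q, axis=1)
            return bool(np.all(dists >= clearance))
        return False                                  # never observed: treat as unsafe


if __name__ == "__main__":
    m = LocalHistoryMap(robot_radius=0.5)
    # A wall of obstacle points 4 m straight ahead, captured in one depth view.
    wall = np.column_stack([np.full(50, 4.0), np.linspace(-2.0, 2.0, 50), np.zeros(50)])
    m.add_view(wall)
    # Fly 1 m forward; assume 5 cm (1-sigma) of drift accumulates over that motion.
    m.propagate(np.array([1.0, 0.0, 0.0]), step_sigma=0.05)
    print(m.is_free(np.array([1.0, 0.0, 0.0])))   # about 2 m from the wall -> True
    print(m.is_free(np.array([2.8, 0.0, 0.0])))   # inside the inflated clearance -> False
```

The sketch mirrors the trade-off the article highlights: because the clearance grows with accumulated drift, a plan stays safe even when the drone's position estimate is off, which is the behavior the MIT team credits for the drop in crash rate from 28 percent to 2 percent of flights under 5 percent drift.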
