Inside Unmanned Systems

OCT-NOV 2016

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.

Issue link: http://insideunmanned.epubxp.com/i/741945

Data collected via LiDAR is used for a variety of applications, from infrastructure mapping to determining how close a tree is to a powerline to mapping areas devastated by flooding. LiDAR has the ability to measure range to within centimeters or even sub-centimeter accuracy, Singh said, and does so rapidly at very high resolution, even at night. If a driverless car, for example, relies only on cameras to collect data, those cameras might not get enough resolution if it's too dark or too bright, or if there's a sudden change in weather conditions, leaving big holes in the car's understanding of the world around it.

Sravan Puttagunta, CEO of Civil Maps, describes LiDAR as a "ground troop sensor when it comes to spatial data." To create accurate maps for driverless cars, Puttagunta and his team fuse information collected via GPS, IMUs, LiDAR and cameras into a point cloud.

"Regardless of the platform, LiDAR gives you a very detailed data set," said James Wilder Young, senior geomatics technologist for Merrick & Company, who is working on LiDAR-based solutions for driverless cars. "It gives you the ability to do a lot of analysis for different applications. Flood plain mapping, biomass analysis, controlling autonomous vehicles. There's just an infinite amount of applications."
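The fusion Puttagunta describes comes down to using the GPS/IMU pose estimate to place each LiDAR return in a common world frame, so that scans taken from different positions stack into one map. The sketch below illustrates only that registration step under simplifying assumptions; the function names, the yaw/pitch/roll pose format and the sample values are invented for the example and do not represent Civil Maps' actual pipeline.

    # Sketch: registering LiDAR returns into a world-frame point cloud using a
    # GPS position and an IMU attitude. Frame conventions and values are
    # illustrative assumptions, not any vendor's actual pipeline.
    import numpy as np

    def attitude_to_matrix(yaw, pitch, roll):
        """Build a 3x3 rotation matrix from IMU attitude angles (radians)."""
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return rz @ ry @ rx

    def register_scan(points_sensor, position_world, yaw, pitch, roll):
        """Transform an N x 3 block of LiDAR points from the sensor frame
        into the world frame given a GPS position and IMU attitude."""
        rotation = attitude_to_matrix(yaw, pitch, roll)
        return points_sensor @ rotation.T + position_world

    # Example: one scan of three points, vehicle at (10 m, 5 m, 0 m), heading 90 degrees.
    scan = np.array([[2.0, 0.0, 0.0], [0.0, 1.5, 0.0], [5.0, -1.0, 0.2]])
    cloud = register_scan(scan, np.array([10.0, 5.0, 0.0]), np.pi / 2, 0.0, 0.0)
    print(cloud)

In a full mapping pipeline the camera imagery the article mentions would be fused in as well, and the pose itself is typically refined by matching successive scans against one another.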
Driverless Cars

Thoughts turned to using LiDAR in driverless cars during the DARPA Grand Challenge in 2005, Velodyne LiDAR Product Line Manager Wayne Seto said, and it became even more popular at the 2007 event. By combining camera systems with LiDAR, participants got more accurate and robust information, an approach still employed today. LiDAR is now being incorporated into the 360-degree view capability in new driverless cars for collision avoidance.

"If you're an unmanned system or driverless car, you want to know how far objects or obstacles are from you so computer algorithms can plan the best possible path to navigate around those objects," Seto said. "If the car only sees in 2D, it can only make an approximation of what distance it is from an object. Knowing the exact distance will reduce the chance for errors and the probability of making a mistake." (A simple sketch of this distance check appears after the sidebars below.)

Because traditional LiDAR is too big and expensive to effectively implement in driverless cars, some manufacturers continue to focus on radar and camera systems, Romano said. It's simply not feasible to mount one of these large laser scanners onto a passenger car. To get around that problem, LiDAR manufacturers are developing smaller systems that give these

OTHER SENSORS MANUFACTURERS AND RESEARCHERS ARE USING IN AUTONOMOUS CARS

RGB cameras
An RGB camera delivers the three basic color components (red, green, and blue) on three different wires. It typically uses three independent CCD sensors to obtain the signals and is designed for acquiring accurate color images. Source: National Instruments

Radar
This system typically consists of a synchronized radio transmitter and receiver that emits radio waves and processes their reflections for display. Radar is used for detecting and locating objects or surface features. Source: merriam-webster.com

Ultrasonic sensors
An ultrasonic sensor measures the distance to an object by sending a sound pulse above the range of human hearing toward the target, then measuring the time it takes the echo to return. (A worked example of this calculation appears below.) Source: Senix

RELATED STORIES ONLINE
For a look at how MIT and Ford used LiDAR to create autonomous shuttle buses on campus, visit insideunmannedsystems.com.

INTEGRATING LiDAR
Drone manufacturers are interested in integrating small LiDAR systems onto their platforms, and Velodyne is working with some of those companies now, Velodyne LiDAR Product Line Manager Wayne Seto said. This can be a complicated, time-consuming task that requires a small onboard embedded computer attached to the LiDAR, which in turn is attached to the drone. To successfully integrate something like that, manufacturers have to understand how it all works together.
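Seto's point about exact distance is easy to see in code: with a 3D point cloud, the range to the nearest return is a direct measurement rather than an estimate inferred from a 2D image. The sketch below is a minimal illustration; the point values are invented, and a real planner would also filter out ground returns and the vehicle's own body before taking the minimum.

    # Sketch: reading the range to the nearest obstacle straight out of a
    # LiDAR point cloud (points are in meters, in the vehicle frame).
    import numpy as np

    def nearest_obstacle_distance(point_cloud, min_range=0.5):
        """Return the distance in meters to the closest return, ignoring
        points closer than min_range (likely the vehicle itself)."""
        ranges = np.linalg.norm(point_cloud, axis=1)
        ranges = ranges[ranges > min_range]
        return float(ranges.min()) if ranges.size else float("inf")

    # Invented example cloud of three returns.
    cloud = np.array([[12.3, -0.4, 0.1], [3.8, 2.1, 0.0], [25.0, 9.7, 1.2]])
    print(f"Nearest obstacle: {nearest_obstacle_distance(cloud):.2f} m")  # about 4.34 m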

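The ultrasonic sensor entry in the sidebar above describes simple time-of-flight ranging: one-way distance is the round-trip echo time multiplied by the speed of sound, divided by two. A minimal worked example, assuming roughly 343 m/s for sound in air at room temperature and an invented echo time:

    # Sketch: converting an ultrasonic echo time into a distance.
    SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at about 20 degrees C (assumption)

    def ultrasonic_distance(echo_time_s):
        """Convert a round-trip echo time in seconds to a one-way distance in meters."""
        return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

    print(f"{ultrasonic_distance(0.0058):.2f} m")  # a 5.8 ms echo is roughly 1 m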