Inside Unmanned Systems

OCT-NOV 2016




"We take the GPS and IMU data and get a solution that gives us the position, the coordinates and roll, pitch and yaw of a given point," said James Wilder Young, a geomatics technologist for Merrick & Company. "Then based on the attitude of the reading from the IMU at any given time and GPS position, we know the time the laser shoots out and what time the receiver receives the point back. Based on that time, position and attitude, we know where that point is hitting on the ground or on a feature, so we get all coordinates of all points of interest. If we have a camera we can color the points as well." (A minimal code sketch of this computation appears below.)

(Above) Colorized mobile LiDAR Merrick & Co. collected for an autonomous vehicle mapping project. (Left) High-density colorized LiDAR flown with the HIGH Definition Mapping System.

LiDAR also can help keep UAS from running into other unmanned or manned aircraft, which is essential when performing missions beyond visual line of sight. While operators must have an exemption from the FAA to fly beyond line of sight today, the industry is eager for that restriction to be lifted. Drone manufacturers have approached Civil Maps about their technology, Puttagunta said, which can create a virtual corridor for drones to fly in while continuously reporting their location to their operators.

"If all the roadways are being mapped by self-driving cars you know where buildings are, where trees are, where the road is," Puttagunta said. "Drones can leverage that. They can follow the roadways and stay in a flight path and avoid obstacles without much assistance from humans. It can work the other way too. As drones are flying they can create maps of roads and lane markings and signage. Cars can leverage that. It's a bidirectional map where a robotic platform can make contributions to the map and also use the map to make decisions." (A toy sketch of such a shared map also appears below.)

The Future

Companies like Riegl, Velodyne and Harris are working to develop LiDAR solutions that overcome some of the main challenges of integrating this technology: cost, size and reliability. Traditional laser scanners simply won't work on driverless cars or UAS. Unmanned aircraft operating under the new small UAS rule, for example, can't weigh more than 55 pounds, Van Rens said, making it vital to have a high-performing, lightweight solution for these platforms.

"This is a technology that hasn't had an application waiting for it," McBride said. "There hasn't been a lot of effort to manufacture it at a volume to the specs that we need, so we're kind of waiting in a sense for the rest of the industry to catch up. Certainly one of the challenges is there hasn't been a market for this technology and that takes time."

The market is transitioning, Romano said, and LiDAR now makes it possible to collect high-fidelity data very quickly. Both autonomous vehicles and UAS need that data, but using LiDAR just hasn't been cost effective. Now that companies are starting to create smaller, more affordable options, he expects to see a paradigm shift.

"It's going to change everything," Romano said. "Reducing power, weight and size. These can be very small devices. Before they were hundreds of pounds and now there's research going on that has these devices smaller than a penny."

Of course LiDAR can't do it alone. While it's one of the many tools UAS and driverless car manufacturers can use to create robust platforms, there are other pieces to the puzzle as well.
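Young's workflow quoted earlier is, in effect, direct georeferencing: the two-way travel time of the pulse gives range, the IMU attitude orients the beam, and the GPS position anchors it in the world. The following is a minimal sketch of that computation in Python, assuming a simplified local ENU frame, a rigidly mounted scanner, and no lever-arm or boresight calibration; all names here are illustrative, not from any vendor's API.

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def attitude_matrix(roll, pitch, yaw):
        # Body-to-local rotation built from roll, pitch and yaw (radians).
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return rz @ ry @ rx

    def georeference_return(gps_pos, roll, pitch, yaw, beam_dir_body, t_fire, t_return):
        # Two-way travel time -> one-way range to the surface that was hit.
        rng = 0.5 * (t_return - t_fire) * C
        # Orient the beam with the IMU attitude, then walk 'rng' meters
        # along it from the GPS position.
        beam_local = attitude_matrix(roll, pitch, yaw) @ np.asarray(beam_dir_body, float)
        return np.asarray(gps_pos, float) + rng * beam_local

    # Level platform at 100 m altitude, beam pointing straight down:
    point = georeference_return([0.0, 0.0, 100.0], 0.0, 0.0, 0.0,
                                [0.0, 0.0, -1.0], 0.0, 200.0 / C)
    # point is approximately [0, 0, 0]: the pulse hit the ground below.

Coloring the points, as Young mentions, then amounts to projecting each computed point into a synchronized camera image and sampling the pixel it lands on.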
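Puttagunta's "bidirectional map" boils down to a shared store that any platform, car or drone, can both write observations into and query for planning. A toy sketch under that assumption follows; the types and methods are hypothetical, not Civil Maps' actual interface.

    from dataclasses import dataclass, field

    @dataclass
    class MapFeature:
        kind: str        # "building", "tree", "lane_marking", "sign", ...
        position: tuple  # (x, y, z) in local map meters
        source: str      # which platform reported it: "car" or "drone"

    @dataclass
    class SharedMap:
        # Toy bidirectional map: any platform contributes features and
        # any platform queries them when planning a path.
        features: list = field(default_factory=list)

        def contribute(self, feature):
            # A real system would deduplicate, timestamp and verify reports.
            self.features.append(feature)

        def obstacles_near(self, x, y, radius):
            return [f for f in self.features
                    if (f.position[0] - x) ** 2 + (f.position[1] - y) ** 2 <= radius ** 2]

    # A car maps a tree; a drone later queries it to keep its corridor clear.
    shared = SharedMap()
    shared.contribute(MapFeature("tree", (12.0, 3.0, 6.0), source="car"))
    hazards = shared.obstacles_near(10.0, 0.0, radius=10.0)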
"There's no independent sensor that can handle all the different use cases or scenarios you see in the real world," Puttagunta said. "There might be a combination of sensors that does a better job. Sensor fusion between cam- era, LiDAR and radar will give you the best data set to work with and when certain di- mensions aren't performing at par. During fog or rainy conditions the cameras might not be as accurate for positioning as LiDAR. LiDAR might not be able to discern color but a cam- era can. In heavy rain radar can see further distances. Sensors working in tandem will get you an overall better performance."
