Inside Unmanned Systems

OCT-NOV 2016

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.

Issue link: http://insideunmanned.epubxp.com/i/741945


COMBINATION OF SENSORS NEEDED FOR AUTONOMOUS VEHICLE DEVELOPMENT

by Kevin Dennehy

Autonomous vehicle sensor technologies such as LiDAR, cameras and GNSS receivers continue to fall in price per unit. One of the main concerns for driverless system developers, however, is deciding which systems work best, either standalone or in conjunction with other devices, to achieve full autonomous driving capability.

…ate 3-D images at 200 meters vs. cameras that, in relative terms, have limited range and field of view, Jellen said. "Furthermore, cameras are susceptible to changing light conditions and shadows. This could be problematic for commuters traveling during dawn and dusk, as an example," he said. "Meanwhile, LiDAR sensors operate well in most lighting conditions, and in fact, LiDAR sensors operate well without any lighting at all."

In August, Velodyne received a $150 million investment from Ford and Chinese Internet search engine giant Baidu.

"CAMERAS ARE SUSCEPTIBLE to changing light conditions and shadows. This could be problematic for commuters traveling during dawn and dusk, as an example. Meanwhile, LiDAR sensors operate well in most lighting conditions, and in fact, LiDAR sensors operate well without any lighting at all."
— Mike Jellen, Velodyne's president and COO

GNSS Sensors Don't Get the Hype

The main navigation and guidance sensor for any autonomous system starts with its GNSS receiver. However, compared to cameras, LiDAR, radar and other sensors, GNSS doesn't get the publicity in most published reports, said Jon Auld, NovAtel director, safety critical systems.

"I may be biased here, but I think some in the industry are underestimating the GNSS component as a critical sensor. At present, camera and LiDAR technology are getting a lot of attention, but they have limitations in their capability for high-precision, absolute positioning," he said. "GNSS offers worldwide coverage, all-weather operation, the ability to provide positioning between two vehicles that cannot see each other and positioning when road and sign markings are not visible. The public will expect an autonomous car to just work, and they will not be content unless the overall solution is available at a very high rate."

As with any new technology, autonomous vehicles will go through the same price pressures, Auld said. "Initially, the systems will be …"
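The tradeoff Auld describes, with GNSS supplying absolute but noisy position fixes while LiDAR or odometry supplies smooth relative motion, is the classic setting for a simple fusion filter. The minimal 1-D sketch below illustrates the idea; the class name, noise values and step data are illustrative assumptions, not anything from the article or any vendor's API.

```python
# Minimal 1-D Kalman-style filter: fuse noisy absolute GNSS fixes
# with smooth relative motion (e.g., LiDAR odometry). Illustrative
# sketch only; all parameter values here are assumed, not published.

class GnssOdometryFuser:
    def __init__(self, x0, p0, q, r):
        self.x = x0   # fused position estimate (metres)
        self.p = p0   # variance of the estimate
        self.q = q    # process noise: odometry drift added per step
        self.r = r    # measurement noise: GNSS fix variance

    def predict(self, dx):
        """Apply relative motion reported by odometry (dx metres)."""
        self.x += dx       # dead-reckon forward
        self.p += self.q   # uncertainty grows with each relative step

    def update(self, z):
        """Correct the estimate with an absolute GNSS fix z (metres)."""
        k = self.p / (self.p + self.r)   # gain: trust in the GNSS fix
        self.x += k * (z - self.x)       # pull estimate toward the fix
        self.p *= (1.0 - k)              # uncertainty shrinks
        return self.x

# Drive 5 m in 1 m steps; GNSS fixes carry a small offset from truth.
fuser = GnssOdometryFuser(x0=0.0, p0=1.0, q=0.01, r=4.0)
for step in range(5):
    fuser.predict(dx=1.0)             # odometry: moved 1 m forward
    est = fuser.update(z=step + 1.2)  # GNSS fix near the true position
print(round(est, 2))
```

The relative channel keeps the trajectory smooth between fixes, while the absolute channel prevents the drift that pure odometry would accumulate, which is the complementarity the article's sources argue for.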
