Inside Unmanned Systems

OCT-NOV 2016

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.


…cars the information LiDAR provides in a much smaller, more economical form factor. Ford has worked with Velodyne LiDAR for the last 10 years, said Jim McBride, Ford's technical leader for autonomous vehicles.

"When you're trying to solve a problem as complicated as this you really want a very detailed 3D representation of the world around you, so you need a sensing system that can see completely around the vehicle, that can see beyond the stopping distance of the vehicle and that can detect things in really fine resolution," McBride said. "In LiDAR you have all of the above. You bring your own light source and actively illuminate the scene with a precise pinprick burst of laser beams."

In August Ford joined with Chinese search engine firm Baidu, which has announced plans for its own autonomous vehicle, to co-invest a total of $150 million into the sensor company. The goal is to mass-produce a more affordable automotive LiDAR sensor, making it more economically feasible to put the technology in cars.

Mapping

LiDAR also can be used to create the maps autonomous vehicles need to travel safely, McBride said. Before any of its cars drives a route autonomously, Ford sends a manned car down that route to make sure it's safe. While driving, the LiDAR collects the data used to create detailed maps for the autonomous vehicles. These maps give the vehicles a better idea of what to expect as they travel down the road, making them much more reliable and robust. Anything of interest, such as crosswalks and stop signs, is annotated in the map.

With such a map, cars don't have to rely on sensors seeing every detail as they travel, McBride said, and the LiDAR data can tell other sensors where to look for objects. For example, the car can't tell if a traffic light is green or red with LiDAR, but a camera can.
LiDAR identifies the traffic light as an object of interest, so the camera can classify what color the light is without searching the entire image for it, and that saves a lot of computation. There's also a lot of redundancy among LiDAR, radar and cameras. Those redundancies can be exploited to remove false positives, leading to more reliable measurements.

Young of Merrick is working with a car manufacturer to develop this kind of database for the company's autonomous vehicle program and said they use both airborne and mobile LiDAR to gather the data. The team collects images from the mobile LiDAR and ortho imagery from the airborne platform to become part of the mapping database. The car always knows where it is in relation to the database to within 10 cm or better. The laser scanners can capture 500,000 to 1 million data points per second.

The team uses both airborne LiDAR on a helicopter and mobile LiDAR on a vehicle because there are areas on roadways, say in an urban canyon, where GPS satellite signals aren't available, compromising the positional solution and requiring a lot of ground control, Young said. The helicopter offers a continuous solution that requires less ground control and that can be referenced to the mobile LiDAR. The helicopter is also a better option when using the mobile solution would require lane closures. Not only is getting permits for lane closures a lengthy process, it's costly to hire a company to close the lane.

"Airborne LiDAR gives you a much better solution and more accurate data, especially when you can't get on the right of way," Young said. "Some permitting processes can take up to six months and the cost can be equal to or more than actually flying the helicopter."
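The cross-cueing described above, where a LiDAR detection tells the camera which patch of the image to classify, can be sketched with a standard pinhole-camera projection. The calibration values and object points below are illustrative placeholders, not Ford's actual parameters; a real system would obtain them from LiDAR-to-camera calibration:

```python
import numpy as np

# Hypothetical calibration, for illustration only.
K = np.array([[1000.0, 0.0, 640.0],    # camera intrinsics: focal lengths
              [0.0, 1000.0, 360.0],    # and principal point, in pixels
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # LiDAR-to-camera rotation
t = np.array([0.0, -0.5, 0.0])         # LiDAR-to-camera translation (meters)

def lidar_point_to_pixel(p_lidar):
    """Project a 3D LiDAR point into camera pixel coordinates (u, v)."""
    p_cam = R @ p_lidar + t            # transform into the camera frame
    if p_cam[2] <= 0:                  # behind the camera: not visible
        return None
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]            # perspective divide

def roi_for_object(points, margin_px=20):
    """Image-space bounding box around the LiDAR returns on one object,
    so the classifier searches only this region, not the whole frame."""
    pixels = [px for px in (lidar_point_to_pixel(p) for p in points)
              if px is not None]
    if not pixels:
        return None
    pixels = np.array(pixels)
    u_min, v_min = pixels.min(axis=0) - margin_px
    u_max, v_max = pixels.max(axis=0) + margin_px
    return (u_min, v_min, u_max, v_max)
```

Restricting the color classifier to `roi_for_object(...)` instead of the full frame is the computational saving the article describes.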
"The helicopter needs a lot less …"

AIR AND LAND TECHNOLOGY FOCUS

Photos courtesy of Riegl, Velodyne and Harris

Riegl VUX-1LR
The VUX-1LR Long Range is a very lightweight, compact and rugged laser scanner designed for airborne surveying missions from helicopters, gyrocopters and other small aircraft. The sensor provides a maximum measurement range of 1,350 m, an effective measurement rate of up to 750,000 measurements/sec., and an operating flight altitude of up to 1,740 ft AGL.

Velodyne Puck Hi-Res sensor
The Puck Hi-Res sensor is Velodyne's latest LiDAR offering and is a version of the LiDAR Puck designed for applications that require greater resolution in the captured 3D image. It has the VLP-16 Puck's 360° horizontal field of view (FoV) and 100-meter range, along with a 20° vertical FoV for a tighter channel distribution (1.33° between channels instead of 2.00°), giving greater detail in the 3D image at longer ranges.
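The tighter channel spacing translates directly into finer vertical detail at range. A back-of-the-envelope calculation (simple trigonometry applied to the published angular specs, not a Velodyne formula) shows the gap between adjacent channels' returns at the sensor's 100-meter range:

```python
import math

def vertical_spacing(range_m, channel_spacing_deg):
    """Approximate vertical gap between adjacent laser channels'
    returns on a flat target at the given range."""
    return range_m * math.tan(math.radians(channel_spacing_deg))

# At 100 m, 1.33 deg spacing leaves roughly a 2.3 m vertical gap between
# channels, versus roughly 3.5 m for the standard VLP-16's 2.00 deg spacing.
gap_hires = vertical_spacing(100.0, 1.33)
gap_vlp16 = vertical_spacing(100.0, 2.00)
```

The roughly one-meter difference per channel gap at 100 m is why the Hi-Res variant resolves smaller objects at the same distance.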
