Inside Unmanned Systems

OCT-NOV 2016

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.


AIR AND LAND TECHNOLOGY FOCUS: LiDAR

Photos courtesy of Hypack, FLIR, Microdrones, BaySpec, Senix Corporation, Ford and Justin Miller (MIT)

"We look at the reflectivity values. Every object reflects a certain amount of light back," Velodyne LiDAR product line manager Wayne Seto said. "It also looks at retroreflectors. White lines have a certain retroreflectivity value to them so you can see them at night. The same with license plates and traffic signs. So if we have a LiDAR on our vehicle and we're driving autonomously, we can scan ahead and say, OK, this is a Toyota Camry in front of us because it has a license plate and we can see the rear lights are functioning. Now we can see those lights are braking by their reflectivity. We can also tell the distance is shrinking between the cars. You now have two signals to help with navigation."

LiDAR technology remained relatively unchanged until recent years, said Mark Romano, senior product manager at Harris Geospatial Solutions. The original linear systems were high power and low sensitivity, while newer systems, such as the Geiger-mode LiDAR from Harris that has been used in military applications, offer low power with high sensitivity. They can collect higher-resolution 3D data at a much faster rate, making them more suitable for driverless cars and UAS than earlier versions.

LiDAR IN DRIVERLESS CARS

When using LiDAR in driverless cars, manufacturers have to operate within the safety limitations regulated by the government, said Jim McBride, technical leader for autonomous vehicles at Ford. These regulations limit how far out the laser beams can look.

"Right now we would like to identify objects beyond the stopping distance of the vehicle," he said. "The wavelength of the laser that's most economically feasible and that meets the safety rules can barely see on the horizon. We're trying to eke out every bit of the sensors we can and stay in the regulations.
We can switch to another wavelength, but if you move from near infrared to mid-infrared we can't operate using silicon, which is the cheapest semiconductor. We have to switch to more exotic materials, and cost and complexity go up dramatically. Those are the tradeoffs we're trying to figure out."

TECHNICAL CHALLENGES

"YOU NEED A SENSING SYSTEM that can see completely around the vehicle, that can see beyond the stopping distance of the vehicle and that can detect things in really fine resolution." Jim McBride, technical leader, autonomous vehicles, Ford

SENSORS COMMONLY FOUND ON DRONES

Multispectral cameras
These cameras are capable of sensing and recording radiation from invisible as well as visible parts of the electromagnetic spectrum. Source:

Hyperspectral cameras
Hyperspectral imaging, or imaging spectroscopy, combines digital imaging with spectroscopy. For each pixel in an image, a hyperspectral camera acquires the light intensity for a large number of contiguous spectral bands. Every pixel in the image contains a continuous spectrum (in radiance or reflectance) and can be used to precisely characterize objects. Source: HySpex

RGB cameras
An RGB camera delivers the three basic color components (red, green and blue) on three different wires. It typically uses three independent CCD sensors to obtain the signals and is designed for acquiring accurate color images. Source: National Instruments

Infrared cameras
These cameras detect infrared energy (heat) and convert it into an electronic signal, which is processed to produce a thermal image on a video monitor and perform temperature calculations. Source: FLIR
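Seto's two-cue idea, a jump in rear-light reflectivity combined with a shrinking gap to the car ahead, can be sketched in a few lines of Python. The function name and thresholds here are illustrative assumptions, not Velodyne's actual algorithm.

```python
def lead_vehicle_braking(reflectivity, ranges, dt=0.1,
                         refl_jump=0.2, closing_mps=0.5):
    """Fuse two LiDAR cues for a braking lead vehicle.

    reflectivity: per-scan reflectivity readings of the rear lights (0-1)
    ranges: per-scan distances to the lead vehicle in meters
    dt: time between scans in seconds
    """
    # Cue 1: brake lights brighten the returns from the rear of the car.
    refl_rising = reflectivity[-1] - reflectivity[-2] > refl_jump
    # Cue 2: the gap between the two cars is shrinking.
    closing = (ranges[-2] - ranges[-1]) / dt > closing_mps
    # Both cues together give higher confidence than either alone.
    return refl_rising and closing
```

With readings of [0.3, 0.6] for reflectivity and [20.0, 19.8] meters for range, both cues fire and the sketch reports a braking lead vehicle.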
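McBride's requirement, seeing beyond the stopping distance of the vehicle, can be made concrete with the standard stopping-distance formula: reaction distance plus braking distance, v²/(2μg). The reaction time and friction values below are textbook illustrations, not Ford's regulatory figures.

```python
def stopping_distance(speed_ms, reaction_s=1.5, friction=0.7, g=9.81):
    """Distance covered during reaction time plus braking
    distance v^2 / (2 * mu * g), all in meters."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * friction * g)

def sensor_range_sufficient(range_m, speed_ms, **kwargs):
    """True if the sensor can see beyond the stopping distance."""
    return range_m > stopping_distance(speed_ms, **kwargs)
```

At highway speed (30 m/s, about 108 km/h) the stopping distance works out to roughly 110 m, so a sensor that sees 100 m out falls short of McBride's criterion while one that sees 200 m out meets it.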
