Inside Unmanned Systems

DEC 2016-JAN 2017

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.

Issue link: http://insideunmanned.epubxp.com/i/763107


by Renee Knight

Manufacturing facilities and warehouses are already benefiting from driverless cars in the form of self-driving industrial vehicles. Car manufacturers could learn a good deal about sensors and autonomous systems from those using these vehicles to deliver parts and raw materials.

… to LiDAR units. Others act more like a metro bus system, making timed stops throughout the day.

The Seegrid vision-guided vehicles use stereo cameras, which have two or more lenses with a separate image sensor or film frame for each lens, to complete material transport. Each vehicle has five pairs of cameras that take pictures of the environment as it travels and then creates a data-dense 3-D point cloud.

"We can identify any given point in the facility, and then as the vehicle navigates through, we compare the pictures that we take," Christensen said, noting these vehicles might move a pallet of goods that needs to be shelved in the warehouse or deliver parts to an assembly line. "Because we take the pictures in stereo we can get them in 3-D space. Monocular cameras don't give you the depth of field to really have an accurate depth measurement, but with the stereo cameras, you can compare two images and get an accurate depth representation of all points. You get a good 3-D point cloud using nothing but cameras."

The technology Clearpath Robotics uses is very similar to what manufacturers tend to turn to for outdoor self-driving cars, CEO Matt Rendall said, with many of them incorporating LiDAR, radar and GPS inertial systems. All sensory inputs inside the OTTO are combined into a self-driving control system. The difference? The indoor environment is more controlled, so the OTTO uses only a 2-D laser scanner and relies on its software and sensor capabilities instead of GPS to navigate.
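The stereo comparison Christensen describes rests on a standard geometric relationship: a point appears at slightly different horizontal positions in the two images, and that offset (the disparity) determines its distance. A minimal sketch of the calculation follows; the focal length and baseline values here are illustrative assumptions, not Seegrid's actual camera parameters.

```python
# Depth from stereo disparity: Z = f * B / d, where f is the focal
# length in pixels, B is the baseline (separation between the two
# lenses) in meters, and d is the disparity in pixels. Larger
# disparity means the point is closer to the cameras.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 700.0,    # assumed focal length, pixels
                         baseline_m: float = 0.12    # assumed lens separation, meters
                         ) -> float:
    """Return depth in meters for one matched point between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# A feature matched 42 px apart between the left and right images
# would sit about two meters from this hypothetical camera pair:
print(round(depth_from_disparity(42.0), 2))  # 2.0
```

Repeating this for every matched feature across all five camera pairs is what yields the dense 3-D point cloud, using nothing but cameras.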
"Ultimately the self-driving control system takes inputs from all different sensors, interprets the inputs and then makes decisions on the next action," Rendall said. "With a Google car it's taking a passenger from one place to the next; in a factory it's taking a pallet from the pickup to the drop-off location. It's very similar."

When an OTTO arrives in a factory, it takes a tour of the facility just like a new forklift driver would, Rendall said. As OTTO gets oriented, it uses the sensors on board to create a base map.

"Automation has a direct economic benefit and a very significant safety benefit as well. We are approaching our 500,000th mile of autonomous travel. We'll reach that before the end of the year. We have never had any accidents or injuries of any kind. Compare that to three people dying every year in forklift accidents in the U.S. There's a major swing in safety when vehicles are driven by cameras or math, versus by people."
Jeff Christensen, vice president of products and services, Seegrid
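The sense-interpret-act cycle Rendall describes can be sketched as a simple loop: fuse the sensor readings into one picture of the vehicle's situation, then pick the next action. The sensor fields and decision thresholds below are illustrative assumptions for a vehicle like the OTTO, not its actual control logic.

```python
# A minimal sense-interpret-act loop: combine readings from the
# on-board sensors into one frame, interpret it, decide the next action.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    laser_min_range_m: float   # closest obstacle seen by the 2-D laser scanner
    map_drift_m: float         # estimated drift against the base map built on the "tour"

def decide(frame: SensorFrame) -> str:
    """Interpret one frame of fused sensor data and choose the next action."""
    if frame.laser_min_range_m < 0.5:
        return "stop"            # obstacle too close: halt immediately
    if frame.map_drift_m > 0.2:
        return "relocalize"      # re-match current observations against the base map
    return "continue_route"      # proceed toward the drop-off location

# Clear path and good localization: keep moving the pallet.
print(decide(SensorFrame(laser_min_range_m=3.0, map_drift_m=0.05)))  # continue_route
```

The same loop structure fits both cases Rendall compares: only the goal differs, a passenger's destination for a road car versus a pallet's drop-off location in a factory.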
