Inside Unmanned Systems

APR-MAY 2016

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.

Issue link: https://insideunmanned.epubxp.com/i/668560


It is difficult, if not impossible, to create an exhaustive list of all potential aiding measurements. It is possible, however, to categorize aiding measurements into generalized types. The RIFE design is abstracted for generic sensors that are grouped into classes according to the type of their measurements. Table 1 lists the generic sensor classes currently included in the RIFE library. To illustrate the generic class representation, consider a simple example of relative position. As formulated in Table 1, this class is defined as the projection of the position-change vector onto a specified axis or axes of the navigation frame or body frame. Figure 5 shows three aiding measurements (odometer, 2D lidar, and 3D lidar) that are represented by this generic observable.

Figure 6 illustrates the operational sequencing of RIFE in the plug-and-play mode. The plug-and-play functionality is realized by the object-oriented multi-sensor fusion architecture, wherein the various sensor modalities are abstracted into classes that are instantiated into sensor objects when specific sensors are connected to the system and removed when disconnected. Accordingly, the filter adapts to the available types and number of measurements by repartitioning error states and system matrices without the need for new coding and compiling. The system starts with a built-in class library populated with legacy sensors. This includes the DRN class, the estimator class (e.g., an extended Kalman filter or a marginalized particle filter), aiding sensor class 1, aiding sensor class 2, and so forth. Objects for the DRN mechanization and the estimator are created at system startup, while aiding sensor objects are populated in a plug-and-play fashion as those sensors are connected to the system.
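The class abstraction described above can be sketched in a few lines. This is an illustrative mock-up, not the RIFE code: the class and attribute names (`AidingSensor`, `RelativePositionSensor`, `measurement_observables`) are assumptions chosen to mirror the article's terminology, and the example shows how two very different devices (an odometer and a 2D lidar) map onto one generic "relative position" class.

```python
from abc import ABC, abstractmethod

class AidingSensor(ABC):
    """Generic aiding-sensor class; concrete sensor types override the
    mapping from raw messages to the generic observable."""
    def __init__(self, name, num_error_states):
        self.name = name
        # error states this sensor contributes to the integration filter
        self.num_error_states = num_error_states

    @abstractmethod
    def measurement_observables(self, raw):
        """Map a raw sensor message to the generic observable."""

class RelativePositionSensor(AidingSensor):
    """Generic 'relative position' class: projection of the position-change
    vector onto specified axes of the navigation or body frame."""
    def __init__(self, name, axes):
        super().__init__(name, num_error_states=len(axes))
        self.axes = axes  # e.g. [0, 1] for the horizontal components

    def measurement_observables(self, raw):
        # project the position-change vector onto the configured axes
        return [raw["delta_position"][i] for i in self.axes]

# Two distinct devices represented by the same generic class
odometer = RelativePositionSensor("odometer", axes=[0])
lidar2d = RelativePositionSensor("2d_lidar", axes=[0, 1])
```

Because the filter only ever sees the generic observable, adding a 3D lidar that extracts ranges to planar surfaces would reuse the same class rather than requiring new filter code.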
The integration filter updates the output navigation states by executing the "state propagation function" of the DRN object at a predetermined update rate (for example, 100 Hz). When measurements from aiding sensors become available, they are applied to estimate errors in the DRN outputs, thus improving the navigation performance. When the measurement from a sensor comes online, the universal interface (UAID) creates an initialization message and sends it to the RIFE, which instantiates the corresponding sensor object. During the instantiation, sensor states (i.e., error states and their corresponding covariance matrices) are initialized using the measurement error models. At this point, the estimator calls its "repartitioning" function to insert the "sensor states." After the sensor object is instantiated (as is the case for aiding sensor 2 in Figure 3), it calls the "measurement observables" function every time a measurement message is received. This function provides data to the "estimation" routine of the estimator object in order to generate "corrections" to the dynamic model solution, yielding the corrected "navigation outputs." The process repeats as new measurements come in or new plug-and-play sensors are added. When a sensor is disconnected, the sensor manager sends a disconnect message to the RIFE, which terminates the sensor object, thus releasing memory and other resources allocated to it. The filter design is readily upgradable to include new types of aiding sensors.
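The repartitioning step is the key to the plug-and-play behavior, and can be sketched as growing and shrinking the estimator's covariance matrix as sensors come and go. This is a minimal sketch under assumed names (`ReconfigurableFilter`, `connect`, `disconnect` are hypothetical, not RIFE's API): connecting a sensor appends an uncorrelated diagonal block for its error states; disconnecting deletes the corresponding rows and columns, releasing those states.

```python
import numpy as np

class ReconfigurableFilter:
    """Sketch of plug-and-play state repartitioning for an EKF-style
    estimator: core DRN error states are always present, sensor error
    states are inserted and removed at run time."""
    def __init__(self, num_core_states):
        self.num_core = num_core_states
        self.P = np.eye(num_core_states)   # covariance of the core error states
        self.sensor_sizes = {}             # insertion-ordered: name -> state count

    def _slices(self):
        # current partition of the state vector: core first, then each sensor
        out, start = {}, self.num_core
        for name, n in self.sensor_sizes.items():
            out[name] = slice(start, start + n)
            start += n
        return out

    def connect(self, name, n_states, initial_var):
        # repartition: grow P with an uncorrelated block initialized from
        # the sensor's measurement error model
        old = self.P.shape[0]
        P = np.zeros((old + n_states, old + n_states))
        P[:old, :old] = self.P
        P[old:, old:] = initial_var * np.eye(n_states)
        self.P = P
        self.sensor_sizes[name] = n_states

    def disconnect(self, name):
        # repartition: delete the sensor's rows/columns; the remaining
        # partition stays consistent
        s = self._slices()[name]
        keep = [i for i in range(self.P.shape[0]) if not (s.start <= i < s.stop)]
        self.P = self.P[np.ix_(keep, keep)]
        del self.sensor_sizes[name]
```

In a full implementation the state-transition and measurement matrices would be repartitioned in the same way, which is what lets the filter adapt without recoding or recompiling.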
If a completely new type of measurement needs to be included, it is done through a software upgrade.

Table 1: Generic Classes of the RIFE Object Library

| Generic Observable | Definition | Example | Example Sensors Represented by This Generic Observable |
|---|---|---|---|
| Position | Projection of the position vector on a specified axis or axes of the navigation or body frame | Navigation-frame position | GNSS position; three-axis magnetometer; RF ID; stop sign; baro altimeter; radar (laser) altimeter |
| Velocity | Projection of the velocity vector on a specified axis or axes of the navigation or body frame | Navigation-frame velocity | Air data |
| Attitude | Any specified combination or combinations of Euler angles (pitch, roll, yaw) | Pitch and roll measurements | Inclinometer; magnetometer |
| Range | Range to a source at a known location | Range to GNSS satellite | GNSS range; TDOA; SOOP range |
| Relative Position | Projection of the position change between successive updates on a specified axis or axes of the navigation or body frame | Change in horizontal position components | GNSS delta ranges; stereo vision; position changes extracted from 2D lidar; ranges to planar surfaces extracted from 3D lidar data; odometer; step sensor |
| Relative Attitude | Change in components of specified body-frame unit vector(s) between updates | Normal vectors to planar surfaces | Heading change from 2D lidar; normal vectors to planar surfaces extracted from 3D lidar data |
| Relative Bearing | Angles to a feature at an unknown location | Point features of monocular images | Monocular camera |
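One way to picture this kind of extensibility is a registry that maps measurement types to generic sensor classes, so that a new observable type is added by registering one new class while the filter core stays untouched. This is a hypothetical sketch (the registry, `register`, and `instantiate` names are assumptions, not part of RIFE); the two registered classes correspond to the "Range" and "Relative Bearing" rows of Table 1.

```python
# Hypothetical class library: maps a measurement type reported by the
# universal interface to the generic sensor class that handles it.
SENSOR_LIBRARY = {}

def register(msg_type):
    """Decorator that adds a generic sensor class to the library."""
    def deco(cls):
        SENSOR_LIBRARY[msg_type] = cls
        return cls
    return deco

@register("range")
class RangeSensor:
    """Generic 'range to a source at a known location' observable
    (GNSS range, TDOA, SOOP range in Table 1)."""
    def __init__(self, source_position):
        self.source_position = source_position

@register("relative_bearing")
class RelativeBearingSensor:
    """Generic 'angles to a feature at an unknown location' observable
    (monocular camera in Table 1)."""

def instantiate(msg_type, **kwargs):
    # called when a newly connected sensor announces its measurement type
    return SENSOR_LIBRARY[msg_type](**kwargs)
```

Under this scheme, supporting an entirely new measurement type is a software upgrade that ships one new class; existing deployments pick it up without any change to the estimator.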
