
Reliable motion estimation on resource-limited platforms is important for many applications. While insects solve this problem in an exemplary manner, mobile robots still require bulky computing and sensor equipment to achieve the required robustness. In this thesis, we aim for an efficient and reliable navigation system that is independent of external devices. To that end, we assess highly effective, yet application-independent, biological concepts. Based on these insights, we propose an inertial-visual system as a minimal sensor combination that still allows for efficient and robust navigation. We focus in particular on algorithms for image-based motion estimation. Different methods are developed to enable efficient image processing and pose estimation at high frame rates: tracking several hundred features and estimating motion from these correspondences is achieved within a few milliseconds on low-power processing units. The precision of the motion computation is evaluated as a function of the aperture angle, the tracking accuracy, and the number of features. In addition, we derive error propagations for image-based pose estimation algorithms; these can be used as an accuracy estimate when fusing camera measurements with other sensors. We propose two different ways of combining inertial measurement units and cameras: either the inertial data is used to support the feature tracking, or it is fused with the visual motion estimates in a Kalman filter. For the spatial and temporal registration of the sensors, we also present different solutions. Finally, the presented approaches are evaluated on synthetic and real data. Furthermore, the algorithms are integrated into several applications, such as hand-held 3D scanning, visual environment modeling, and driving as well as flying robots.
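The loosely-coupled fusion mentioned in the abstract, where visual motion estimates are combined with inertial data in a Kalman filter, can be illustrated with a minimal sketch. This is not the thesis's implementation: it assumes a toy 1-D constant-velocity model in which IMU acceleration drives the prediction step and a camera-derived position acts as the measurement; all function names and noise parameters are illustrative.

```python
import numpy as np

def kf_predict(x, P, a, dt, q):
    """Propagate state x = [position, velocity] using an IMU
    acceleration sample a over timestep dt; q is process-noise variance."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    x = F @ x + B * a
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def kf_update(x, P, z, r):
    """Correct the state with a visual position measurement z
    (e.g. from image-based pose estimation); r is measurement variance."""
    H = np.array([[1.0, 0.0]])              # camera observes position only
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain (2x1)
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

In such a scheme the error propagation derived for the image-based pose estimator would supply the measurement variance `r`, so that the filter weights the camera correction according to its actual accuracy.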
Efficient and robust pose estimation based on inertial and visual sensing, Elmar Mair
- Title
- Efficient and robust pose estimation based on inertial and visual sensing
- Language
- English
- Author
- Elmar Mair
- Publisher
- Verl. Dr. Hut
- Publication date
- 2012
- ISBN-10
- 3843906041
- ISBN-13
- 9783843906043
- Category
- Computer Science & Programming