Towards Robust Visual Inertial Odometry in Feature Sparse Structured Environments

Abstract

In conventional Visual Odometry (VO), featureless scenarios, such as facing a uniformly textured wall, often lead to failure or to reliance on fused inertial tracking. Depending on sensor quality, calibration, and initialization, this can cause severe drift in the positioning solution, limiting the use of visual methods in practical applications such as first-responder missions in infrastructure-free indoor areas. To overcome this limitation, we present a novel RGB-D Inertial Odometry pipeline that fuses a low-cost RGB-D camera with an inertial sensor to estimate low-drift trajectories in structured environments. Our method is practical, waiving the need for extensive Inertial Measurement Unit (IMU) calibration and bias initialization while accommodating affordable sensors. We leverage the Manhattan World assumption, which posits that major man-made structures conform to three mutually orthogonal directions. Building on this assumption, we develop a dynamic Manhattan Decision Module that robustly computes absolute orientation in challenging indoor pedestrian tracking scenarios. Moreover, we propose a novel algorithm that partially recovers translational movement from depth sensor information when no visible features are present. Our experiments with commodity sensors demonstrate the superior performance and robustness of our method in estimating low-drift trajectories in challenging environments, without dedicated IMU initialization movements or IMU noise parameter estimation.
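The abstract does not spell out how the Manhattan Decision Module works internally. As a rough illustration of the general idea behind Manhattan-frame orientation estimation, the sketch below (a minimal assumption-laden example, not the authors' method) estimates the rotation between the camera and a Manhattan frame from unit surface normals derived from a depth image, using an iterative assign-and-align loop with a Kabsch/Procrustes update. The function name, thresholds, and iteration count are hypothetical choices for this sketch.

```python
import numpy as np

def estimate_manhattan_frame(normals, R0=None, iters=10, cos_thresh=0.9):
    """Illustrative sketch: estimate rotation R whose columns are the three
    Manhattan axes expressed in the camera frame, given unit surface
    normals of shape (N, 3) extracted from a depth image."""
    R = np.eye(3) if R0 is None else R0.copy()
    n = len(normals)
    for _ in range(iters):
        proj = normals @ R                    # cosine of angle to each axis
        axis = np.abs(proj).argmax(axis=1)    # nearest Manhattan axis
        best = proj[np.arange(n), axis]
        keep = np.abs(best) > cos_thresh      # reject clutter normals
        if keep.sum() < 3:
            break
        # Signed canonical axis each kept normal should align with.
        targets = np.zeros((n, 3))
        targets[np.arange(n), axis] = np.sign(best)
        # Kabsch: rotation minimizing sum ||R t_i - n_i||^2 over kept pairs.
        H = targets[keep].T @ normals[keep]
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R
```

Because the walls and floor of a structured indoor scene keep supplying normals even when the image is texture-poor, an estimate of this kind can provide an absolute, drift-free orientation reference that a VO/IMU fusion pipeline may fall back on; the paper's module additionally decides dynamically when such an estimate is trustworthy.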

Category

Conference paper

Language

English

Author(s)

Affiliation

  • University of Helsinki
  • SINTEF Digital / Sustainable Communication Technologies

Year

2023

Publisher

IEEE conference proceedings

Book

Proceedings of the 2023 13th International Conference on Indoor Positioning and Indoor Navigation (IPIN)

ISBN

979-8-3503-2011-4
