Visual Odometry Drift Reduction
Feature point matching is a critical step in visual odometry computation and many other vision applications. Frame-to-frame ego-motion drift caused by feature mismatching is the main challenge for visual odometry. This paper presents a visual odometry algorithm that uses a newly developed feature descriptor, the synthetic basis (SYBA) descriptor, to obtain accurate feature matching and reduce drift. An initial estimate of the camera motion is calculated from the matched feature pairs. Feature points in the current frame are then transformed into the next frame using this initial estimate. The sample means of the matched points and of the transformed points in the next frame are used to obtain the final estimate of the camera motion, reducing the drift, or re-projection error. Our algorithm uses a sliding window approach that extends feature transformation into subsequent frames to overcome the short-baseline limitation of visual odometry. The accuracy of the proposed system is evaluated and compared with competing VO methods and with ground truth (GPS+IMU data).
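The refinement step described above can be illustrated with a short sketch. The code below is a minimal numpy illustration, not the paper's implementation: the function name refine_motion and the specific rule of shifting the translation so the two sample means coincide are assumptions based on the abstract, and the paper's actual formulation may differ.

    import numpy as np

    def refine_motion(R, t, pts_curr, pts_next):
        # Hypothetical sketch of the drift-reduction step.
        # R, t     : initial rotation (3x3) and translation (3,) estimate
        # pts_curr : Nx3 feature points in the current frame
        # pts_next : Nx3 matched feature points in the next frame

        # Transform current-frame points into the next frame with the
        # initial motion estimate.
        transformed = pts_curr @ R.T + t

        # Sample means of the matched points and of the transformed points.
        mean_next = pts_next.mean(axis=0)
        mean_transformed = transformed.mean(axis=0)

        # Shift the translation so the two sample means coincide,
        # reducing the residual (re-projection) error.
        t_refined = t + (mean_next - mean_transformed)
        return R, t_refined

    # Example with synthetic data: identity rotation, drifted translation.
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(50, 3))
    R0, t0 = np.eye(3), np.array([0.1, 0.0, 0.02])   # noisy initial estimate
    true_t = np.array([0.12, -0.01, 0.0])
    R, t = refine_motion(R0, t0, pts, pts + true_t)
    # t now equals true_t because the synthetic points are noise-free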
Graduate Students:
Alok Desai
Publications:
- A. Desai and D. J. Lee, "Visual Odometry Drift Reduction Using SYBA Descriptor and Feature Transformation," IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 7, pp. 1839-1851, July 2016.