
Feature point matching is a critical step in visual odometry and many other vision applications. Frame-to-frame ego-motion drift caused by feature mismatches is the main challenge for visual odometry. This paper presents a visual odometry algorithm that uses a newly developed feature descriptor, the synthetic basis (SYBA) descriptor, to obtain accurate feature matches and reduce drift. An initial estimate of the camera motion is calculated from the matched feature pairs. Feature points in the current frame are then transformed into the next frame using this initial motion estimate. The sample means of the differences between the matched points and the transformed points in the next frame are used to obtain the final estimate of camera motion, reducing drift and re-projection error. Our algorithm uses a sliding-window approach to extend the feature transformation into subsequent frames, overcoming the short-baseline limitation of visual odometry. The accuracy of the proposed system is evaluated and compared with competing VO methods and with ground truth (GPS + IMU data).
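To make the two-stage estimation described above concrete, the sketch below illustrates the idea in Python. This is a simplified illustration under stated assumptions, not the published implementation: OpenCV's essential-matrix routines stand in for the paper's SYBA-based matching and motion solver, a unit-depth back-projection is assumed purely for brevity, and the helper names (initial_motion, sample_mean_correction) are hypothetical.

import numpy as np
import cv2  # assumption: OpenCV stands in for the paper's SYBA-based pipeline


def initial_motion(pts_curr, pts_next, K):
    """Initial camera-motion estimate (R, t) from matched feature pairs."""
    E, _ = cv2.findEssentialMat(pts_curr, pts_next, K, cv2.RANSAC, 0.999, 1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_curr, pts_next, K)
    return R, t


def sample_mean_correction(pts_curr, pts_next, R, t, K):
    """Transform current-frame points with the initial motion, then use the
    sample mean of the residuals against the matched next-frame points as a
    drift correction (unit depth assumed here purely for illustration)."""
    rays = cv2.undistortPoints(pts_curr.reshape(-1, 1, 2), K, None).reshape(-1, 2)
    rays = np.hstack([rays, np.ones((len(rays), 1))])   # normalized camera rays
    transformed = (R @ rays.T).T + t.ravel()             # apply initial motion
    proj = (K @ transformed.T).T
    proj = proj[:, :2] / proj[:, 2:3]                    # re-project to pixels
    return np.mean(pts_next - proj, axis=0)              # sample-mean residual


if __name__ == "__main__":
    # Toy example: synthetic 3-D points seen from two nearby camera poses.
    # A real system would match descriptors (SYBA in the paper) and repeat
    # the correction over a sliding window of subsequent frames.
    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
    rng = np.random.default_rng(0)
    X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(60, 3))  # points in front of camera

    def project(X, R, t):
        x = (K @ (R @ X.T + t.reshape(3, 1))).T
        return x[:, :2] / x[:, 2:3]

    pts_curr = project(X, np.eye(3), np.zeros(3))
    pts_next = project(X, np.eye(3), np.array([0.1, 0.0, 0.05]))
    R, t = initial_motion(pts_curr, pts_next, K)
    print("drift correction (pixels):", sample_mean_correction(pts_curr, pts_next, R, t, K))

In the full algorithm this correction is applied over a sliding window so that features are transformed into several subsequent frames rather than only the next one, which mitigates the short baseline between consecutive frames.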

 Graduate Students:

  Alok Desai

Publications:
  1. A. Desai and D. J. Lee, “Visual Odometry Drift Reduction Using SYBA Descriptor and Feature Transformation,” IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 7, pp. 1839-1851, July 2016.
Figures: SYBA Visual Odometry; Feature Transformation

 
