Robust and real-time egomotion estimation using a compound omnidirectional sensor

Trung Ngo Thanh, Hajime Nagahara, Ryusuke Sagawa, Yasuhiro Mukaigawa, Masahiko Yachida, Yasushi Yagi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

We propose a new egomotion estimation algorithm for a compound omnidirectional camera. Image features are detected with a conventional feature detector and then quickly classified as near or far by checking whether they lie at infinity on the omnidirectional image of the compound sensor. Egomotion is then estimated in two steps: rotation is first recovered from the far features, and translation is then estimated from the near features using the recovered rotation. RANSAC is applied in both the rotation and translation estimations. Experiments in various environments show that our approach is robust and achieves good accuracy in real time, even for large motions.
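The two-step pipeline in the abstract (rotation from far features, then translation from near features, each wrapped in RANSAC) can be sketched as follows. This is a minimal illustration on synthetic bearing vectors, not the paper's implementation: the function names, the Kabsch-based rotation fit, and the epipolar null-space solve for the translation direction are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(7)

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def kabsch(a, b):
    # Least-squares rotation R such that b ~ a @ R.T (orthogonal Procrustes).
    U, _, Vt = np.linalg.svd(b.T @ a)
    return U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt

def rotation_from_far(b1, b2, iters=100, thresh=0.01):
    # Step 1: far features are (nearly) unaffected by translation, so a
    # 2-point RANSAC sample suffices to hypothesize the rotation.
    best_R, best_n = np.eye(3), -1
    for _ in range(iters):
        i = rng.choice(len(b1), 2, replace=False)
        R = kabsch(b1[i], b2[i])
        n = int((np.linalg.norm(b2 - b1 @ R.T, axis=1) < thresh).sum())
        if n > best_n:
            best_R, best_n = R, n
    inl = np.linalg.norm(b2 - b1 @ best_R.T, axis=1) < thresh
    return kabsch(b1[inl], b2[inl])          # refit on the inlier set

def translation_from_near(b1, b2, R, iters=100, thresh=0.01):
    # Step 2: with R known, each near correspondence constrains the
    # translation direction via t . ((R b1) x b2) = 0.
    c = np.cross(b1 @ R.T, b2)
    best_t, best_n = np.zeros(3), -1
    for _ in range(iters):
        i = rng.choice(len(c), 2, replace=False)
        t = np.linalg.svd(c[i])[2][-1]       # null direction of 2 constraints
        n = int((np.abs(c @ t) < thresh).sum())
        if n > best_n:
            best_t, best_n = t, n
    inl = np.abs(c @ best_t) < thresh
    return np.linalg.svd(c[inl])[2][-1]      # direction only (scale-free)

# Synthetic check with camera motion X2 = R_true @ X1 + t_true.
ang = 0.3
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = unit(np.array([1.0, 0.5, 0.2]))
b1_far = unit(rng.normal(size=(40, 3)))      # bearings to "infinite" points
b2_far = b1_far @ R_true.T                   # translation negligible at infinity
near = rng.normal(size=(40, 3)) * 3.0        # nearby 3-D points
b1_near = unit(near)
b2_near = unit(near @ R_true.T + t_true)

R_est = rotation_from_far(b1_far, b2_far)
t_est = translation_from_near(b1_near, b2_near, R_est)
```

Note that translation from bearing vectors alone is recoverable only up to scale and sign, which is why the sketch returns a unit direction; the near/far split is what decouples the two estimations, since far features constrain only rotation.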

Original language: English
Title of host publication: 2008 IEEE International Conference on Robotics and Automation, ICRA 2008
Pages: 492-497
Number of pages: 6
DOIs
Publication status: Published - Sep 18, 2008
Event: 2008 IEEE International Conference on Robotics and Automation, ICRA 2008 - Pasadena, CA, United States
Duration: May 19, 2008 - May 23, 2008

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Other

Other: 2008 IEEE International Conference on Robotics and Automation, ICRA 2008
Country: United States
City: Pasadena, CA
Period: 5/19/08 - 5/23/08

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering
