Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor

Trung Thanh Ngo, Yuichiro Kojima, Hajime Nagahara, Ryusuke Sagawa, Yasuhiro Mukaigawa, Masahiko Yachida, Yasushi Yagi

Research output: Contribution to journal › Article › Peer-reviewed

4 Citations (Scopus)

Abstract

For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves robustness, since translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large-FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.
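The two-stage scheme in the abstract, classify features by triangulated depth, estimate rotation from far features with RANSAC, then recover translation from near features, can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the synthetic scene, the 10 m depth threshold, the RANSAC sample size and inlier tolerance, and the Kabsch least-squares rotation fit are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def skew(v):
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def fit_rotation(P, Q):
    """Least-squares rotation R with Q ~ P @ R.T (Kabsch algorithm)."""
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Synthetic scene (assumed data): near points within a few metres,
# far points 50-200 m away from an omnidirectional camera at the origin.
near = rng.uniform(-2.0, 2.0, (40, 3)) + np.array([0.0, 0.0, 1.5])
dirs = rng.normal(size=(40, 3))
far = dirs / np.linalg.norm(dirs, axis=1, keepdims=True) * rng.uniform(50, 200, (40, 1))
pts = np.vstack([near, far])

R_true = rot_z(np.deg2rad(5.0))          # ground-truth egomotion
t_true = np.array([0.3, -0.1, 0.05])

depth = np.linalg.norm(pts, axis=1)      # multi-baseline stereo would triangulate this
b1 = pts / depth[:, None]                # unit bearing vectors before the motion
p2 = (pts - t_true) @ R_true.T           # points in the moved camera frame
b2 = p2 / np.linalg.norm(p2, axis=1, keepdims=True)

# Step 1: classify features by triangulated depth (threshold is an assumption).
far_mask = depth > 10.0

# Step 2: rotation-only RANSAC on far features (translation barely moves them).
far_idx = np.flatnonzero(far_mask)
best_R, best_inliers = np.eye(3), far_idx[:3]
for _ in range(100):
    sample = rng.choice(far_idx, 3, replace=False)
    R = fit_rotation(b1[sample], b2[sample])
    resid = np.linalg.norm(b2[far_idx] - b1[far_idx] @ R.T, axis=1)
    inliers = far_idx[resid < 0.01]
    if inliers.size > best_inliers.size:
        best_R, best_inliers = R, inliers
best_R = fit_rotation(b1[best_inliers], b2[best_inliers])  # refit on all inliers

# Step 3: translation from near features, with the rotation removed.
# Each de-rotated bearing d is parallel to (p - t), so (p - t) x d = 0,
# i.e. [d]x t = d x p: a linear system in t, solved by least squares.
A, b = [], []
for i in np.flatnonzero(~far_mask):
    d = best_R.T @ b2[i]
    A.append(skew(d))
    b.append(np.cross(d, depth[i] * b1[i]))
t_est = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
```

Splitting the problem this way is what reduces the complexity: rotation is fit to far features alone (where translation is negligible), and translation then reduces to a small linear system over the near features instead of a joint six-parameter search.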

Original language: English
Pages (from-to): 152-166
Number of pages: 15
Journal: IEICE Transactions on Information and Systems
Volume: E93-D
Issue number: 1
Publication status: Published - 2010

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence

