Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor

Trung Thanh Ngo, Yuichiro Kojima, Hajime Nagahara, Ryusuke Sagawa, Yasuhiro Mukaigawa, Masahiko Yachida, Yasushi Yagi

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves the robustness since the translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.
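The pipeline described in the abstract — classify features as near or far via stereo disparity, then estimate rotation from far features with RANSAC — can be illustrated with a minimal sketch. The disparity threshold, the two-bearing minimal sample, and all function names below are illustrative assumptions, not the paper's actual implementation; the rotation fit uses the standard Kabsch/Procrustes solution on unit bearing vectors.

```python
import numpy as np

def classify_features(disparities, near_threshold=2.0):
    """Split feature indices into near/far by multi-baseline stereo disparity.

    The threshold value is a hypothetical stand-in for the paper's rule.
    """
    d = np.asarray(disparities)
    return np.where(d >= near_threshold)[0], np.where(d < near_threshold)[0]

def fit_rotation(p, q):
    """Least-squares rotation R with q_i ~= R p_i (Kabsch on unit bearings)."""
    U, _, Vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def ransac_rotation(p, q, iters=200, thresh=0.02, seed=None):
    """RANSAC rotation estimate from far-feature bearing pairs (rows of p, q)."""
    rng = np.random.default_rng(seed)
    best_R, best_inliers = np.eye(3), np.zeros(len(p), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(p), size=2, replace=False)   # minimal 2-bearing sample
        R = fit_rotation(p[idx], q[idx])
        inliers = np.linalg.norm(q - p @ R.T, axis=1) < thresh
        if inliers.sum() > best_inliers.sum():            # keep best consensus set
            best_R, best_inliers = fit_rotation(p[inliers], q[inliers]), inliers
    return best_R, best_inliers
```

In the paper's scheme, the translation would then be estimated in a second, separate RANSAC pass over the near features once the rotation has been compensated, which is what reduces the overall search complexity.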

Original language: English
Pages (from-to): 152-166
Number of pages: 15
Journal: IEICE Transactions on Information and Systems
Volume: E93-D
Issue number: 1
DOIs: 10.1587/transinf.E93.D.152
Publication status: Published - Jan 1 2010


All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence

Cite this

Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor. / Ngo, Trung Thanh; Kojima, Yuichiro; Nagahara, Hajime; Sagawa, Ryusuke; Mukaigawa, Yasuhiro; Yachida, Masahiko; Yagi, Yasushi.

In: IEICE Transactions on Information and Systems, Vol. E93-D, No. 1, 01.01.2010, p. 152-166.


Ngo, Trung Thanh ; Kojima, Yuichiro ; Nagahara, Hajime ; Sagawa, Ryusuke ; Mukaigawa, Yasuhiro ; Yachida, Masahiko ; Yagi, Yasushi. / Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor. In: IEICE Transactions on Information and Systems. 2010 ; Vol. E93-D, No. 1. pp. 152-166.
@article{47790f38c3f2438ea7d76a7b89cf4669,
title = "Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor",
abstract = "For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves the robustness since the translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.",
author = "Ngo, {Trung Thanh} and Yuichiro Kojima and Hajime Nagahara and Ryusuke Sagawa and Yasuhiro Mukaigawa and Masahiko Yachida and Yasushi Yagi",
year = "2010",
month = "1",
day = "1",
doi = "10.1587/transinf.E93.D.152",
language = "English",
volume = "E93-D",
pages = "152--166",
journal = "IEICE Transactions on Information and Systems",
issn = "0916-8532",
publisher = "Institute of Electronics, Information and Communication Engineers (IEICE)",
number = "1",

}

TY - JOUR

T1 - Real-time estimation of fast egomotion with feature classification using compound omnidirectional vision sensor

AU - Ngo, Trung Thanh

AU - Kojima, Yuichiro

AU - Nagahara, Hajime

AU - Sagawa, Ryusuke

AU - Mukaigawa, Yasuhiro

AU - Yachida, Masahiko

AU - Yagi, Yasushi

PY - 2010/1/1

Y1 - 2010/1/1

N2 - For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves the robustness since the translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.

AB - For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves the robustness since the translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.

UR - http://www.scopus.com/inward/record.url?scp=77950197582&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77950197582&partnerID=8YFLogxK

U2 - 10.1587/transinf.E93.D.152

DO - 10.1587/transinf.E93.D.152

M3 - Article

AN - SCOPUS:77950197582

VL - E93-D

SP - 152

EP - 166

JO - IEICE Transactions on Information and Systems

JF - IEICE Transactions on Information and Systems

SN - 0916-8532

IS - 1

ER -