Visual impression localization of autonomous robots

Somar Boubou, A. H. Abdul Hafez, Einoshin Suzuki

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

2 Citations (Scopus)

Abstract

This paper proposes a novel localization approach based on visual impressions. We define a visual impression as the representation of the HSV color distribution of a place. The representation uses a clustering feature (CF) tree to manage the color distribution, and we propose to weight each CF entry to indicate its importance. The method compares the navigating tree, which the robot creates from its observations, with the available reference trees of the environment. In addition, we propose a new similarity measure to compare two CF trees that represent the visual impressions of the corresponding two places. The method is tested on two data sets collected in different environments. The results of the experiments show the effectiveness of the proposed method.
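The abstract gives only a high-level description; the paper's actual CF-tree construction, entry weighting scheme, and similarity measure are not reproduced in this record. The sketch below is a minimal illustration, assuming BIRCH-style clustering features (N, linear sum, squared sum) over HSV pixel vectors, an arbitrary size-based entry weight, and an ad-hoc weighted centroid-distance similarity between two sets of leaf entries. None of these choices are taken from the paper itself; they only show the kind of data structure and comparison the abstract refers to.

```python
import numpy as np

class CFEntry:
    """Clustering feature (CF) over a set of HSV pixel vectors.

    A CF is the triple (N, LS, SS): the number of points, their linear sum,
    and the sum of their squared norms, as in the BIRCH CF tree.
    """
    def __init__(self, dim=3):
        self.n = 0                    # number of points absorbed
        self.ls = np.zeros(dim)       # linear sum of the points
        self.ss = 0.0                 # sum of squared norms
        self.weight = 0.0             # illustrative importance weight (not the paper's scheme)

    def add(self, x):
        x = np.asarray(x, dtype=float)
        self.n += 1
        self.ls += x
        self.ss += float(x @ x)

    def centroid(self):
        return self.ls / max(self.n, 1)


def entry_similarity(a, b):
    """Toy similarity between two CF entries: closeness of centroids
    scaled by both entries' weights (illustrative only)."""
    d = np.linalg.norm(a.centroid() - b.centroid())
    return a.weight * b.weight / (1.0 + d)


def tree_similarity(entries_a, entries_b):
    """Greedy best-match score between two sets of leaf entries,
    standing in for a comparison of two CF trees (navigating vs. reference)."""
    if not entries_a or not entries_b:
        return 0.0
    score = sum(max(entry_similarity(ea, eb) for eb in entries_b) for ea in entries_a)
    return score / len(entries_a)


# Usage: build entries from random HSV pixels of two hypothetical "places" and compare them.
rng = np.random.default_rng(0)
place_a, place_b = [], []
for _ in range(4):
    e1, e2 = CFEntry(), CFEntry()
    for _ in range(50):
        e1.add(rng.uniform(0, 1, 3))  # HSV values normalized to [0, 1]
        e2.add(rng.uniform(0, 1, 3))
    e1.weight = e1.n / 200.0          # weight each entry by its relative size
    e2.weight = e2.n / 200.0
    place_a.append(e1)
    place_b.append(e2)

print(f"similarity(A, B) = {tree_similarity(place_a, place_b):.3f}")
```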

Original language: English
Title of host publication: 2015 IEEE Conference on Automation Science and Engineering
Subtitle of host publication: Automation for a Sustainable Future, CASE 2015
Publisher: IEEE Computer Society
Pages: 328-334
Number of pages: 7
ISBN (Electronic): 9781467381833
DOI: 10.1109/CoASE.2015.7294100
Publication status: Published - Oct 7, 2015
Event: 11th IEEE International Conference on Automation Science and Engineering, CASE 2015 - Gothenburg, Sweden
Duration: Aug 24, 2015 - Aug 28, 2015

Publication series

Name: IEEE International Conference on Automation Science and Engineering
Volume: 2015-October
ISSN (Print): 2161-8070
ISSN (Electronic): 2161-8089

Other

Other: 11th IEEE International Conference on Automation Science and Engineering, CASE 2015
Country: Sweden
City: Gothenburg
Period: 8/24/15 - 8/28/15

Fingerprint

  • Robots
  • Color
  • Experiments

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

Boubou, S., Hafez, A. H. A., & Suzuki, E. (2015). Visual impression localization of autonomous robots. In 2015 IEEE Conference on Automation Science and Engineering: Automation for a Sustainable Future, CASE 2015 (pp. 328-334). [7294100] (IEEE International Conference on Automation Science and Engineering; Vol. 2015-October). IEEE Computer Society. https://doi.org/10.1109/CoASE.2015.7294100
