Fast competition approach using self organizing map for lip-reading applications

Alaa Sagheer, Naoyuki Tsuruta, Rin-Ichiro Taniguchi, Daisaku Arita, Sakashi Maeda

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

The Self-Organizing Map (SOM), or Kohonen SOM, is one of the most widely used neural network paradigms based on unsupervised competitive learning. However, the search algorithm introduced by Kohonen is slow when the map is large, because the best matching neuron for each input sample is sought among all the map neurons. In this paper, we present a new strategy capable of accelerating the SOM's competition algorithm. Instead of the Kohonen SOM strategy, the new approach considers only the neurons aligned along the low-order principal components of the feature space and neglects the remaining neurons. The idea is based on the fact that most of the data variance lies on the low-order principal components of the manifold, which often contain the most important features of the data [1][6]. The new SOM works effectively as a feature extractor for all kinds of manifolds, even curved ones. Two data sets are used to illustrate how effectively the proposed algorithm reduces the computation effort (and time) of the SOM. For an N-dimensional feature space, it is shown that the computation effort needed to find the best matching unit is reduced from O(D1 × D2 × ... × DN) to O(D1 + D2 + ... + DN), where Di is the number of neurons along dimension i. Also, under the same experimental conditions, our method's computation time is about six times lower than that of the fast DCT. In all cases, the new SOM shows at least the same recognition accuracy, and in some cases better.
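
The core trick can be illustrated with a short sketch (this is not the authors' implementation; the grid sizes, the assumption that the two grid axes follow the first two principal components, and the helper names bmu_full_search / bmu_fast_competition are hypothetical). Instead of scanning all D1 × D2 neurons for the best matching unit, the competition is run along one grid axis at a time, so only about D1 + D2 distance evaluations are needed:

import numpy as np

# Sketch of the fast-competition idea on a 2-D SOM grid whose axes are
# assumed to be aligned with the first two principal components of the data.
rng = np.random.default_rng(0)
D1, D2, dim = 20, 15, 64                        # grid size and input dimensionality (illustrative)
weights = rng.standard_normal((D1, D2, dim))    # codebook vectors (hypothetical initialization)

def bmu_full_search(x):
    """Classic Kohonen competition: O(D1 * D2) distance evaluations."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

def bmu_fast_competition(x):
    """Axis-wise competition: O(D1 + D2) distance evaluations.
    First pick the best row using the neurons along the first grid axis,
    then pick the best column within that row."""
    i = np.argmin(np.linalg.norm(weights[:, 0, :] - x, axis=1))   # D1 comparisons
    j = np.argmin(np.linalg.norm(weights[i, :, :] - x, axis=1))   # D2 comparisons
    return i, j

x = rng.standard_normal(dim)
print(bmu_full_search(x), bmu_fast_competition(x))

For an N-dimensional grid the same axis-by-axis search generalizes to O(D1 + ... + DN) evaluations, which is the reduction claimed in the abstract.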

Original language: English
Title of host publication: International Joint Conference on Neural Networks 2006, IJCNN '06
ISBN (Print): 0780394909, 9780780394902
Pages: 3775-3782
Number of pages: 8
Publication status: Published - Dec 1 2006
Event: International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada
Duration: Jul 16 2006 - Jul 21 2006

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
ISSN (Print): 1098-7576

Other

Other: International Joint Conference on Neural Networks 2006, IJCNN '06
Country: Canada
City: Vancouver, BC
Period: 7/16/06 - 7/21/06

Fingerprint

  • Self organizing maps
  • Neurons
  • Neural networks

All Science Journal Classification (ASJC) codes

  • Software

Cite this

Sagheer, A., Tsuruta, N., Taniguchi, R-I., Arita, D., & Maeda, S. (2006). Fast competition approach using self organizing map for lip-reading applications. In International Joint Conference on Neural Networks 2006, IJCNN '06 (pp. 3775-3782). [1716618] (IEEE International Conference on Neural Networks - Conference Proceedings).
