Enhancing the generalization ability of neural networks by using Gram-Schmidt orthogonalization algorithm

W. Wan, K. Hirasawa, J. Hu, J. Murata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Generalization ability is the most important criterion for judging how powerful a neural network training algorithm is, and many algorithms have been devised to enhance it [1][2]. In this paper, a new algorithm is proposed that applies the Gram-Schmidt orthogonalization algorithm [3] to the outputs of the nodes in the hidden layers, with the aim of reducing the interference among the hidden nodes; this approach is much more efficient than regularizer-based methods. Simulation results confirm the above assertion.
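
The abstract does not specify how the orthogonalization is embedded in training, so the following is only a minimal sketch of the underlying idea: classical Gram-Schmidt applied to the hidden-node output vectors collected over a batch of inputs. The network (a tanh hidden layer), the batch, and the helper name gram_schmidt_columns are assumptions for illustration, not the authors' procedure.

```python
# Sketch only: orthogonalize hidden-layer outputs with classical Gram-Schmidt.
# Each column of H holds one hidden node's outputs over a batch of inputs;
# orthogonalizing the columns removes the linear interference between nodes.
import numpy as np

def gram_schmidt_columns(H, eps=1e-12):
    """Return a matrix whose columns are the orthonormalized columns of H."""
    Q = np.zeros_like(H, dtype=float)
    for j in range(H.shape[1]):
        v = H[:, j].astype(float)
        # subtract projections onto the already-orthonormalized columns
        for i in range(j):
            v -= (Q[:, i] @ H[:, j]) * Q[:, i]
        norm = np.linalg.norm(v)
        Q[:, j] = v / norm if norm > eps else v
    return Q

# Toy example (hypothetical network): 5 inputs, 2 features, 3 hidden nodes.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))        # batch of inputs
W = rng.normal(size=(2, 3))        # input-to-hidden weights (assumed)
H = np.tanh(X @ W)                 # hidden-layer outputs
H_orth = gram_schmidt_columns(H)
print(np.round(H_orth.T @ H_orth, 3))  # ~identity: node outputs are decorrelated
```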

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Pages: 1721-1726
Number of pages: 6
Volume: 3
Publication status: Published - 2001
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: Jul 15, 2001 - Jul 19, 2001

Other

Other: International Joint Conference on Neural Networks (IJCNN'01)
Country: United States
City: Washington, DC
Period: 7/15/01 - 7/19/01

All Science Journal Classification (ASJC) codes

  • Software

  • Cite this

    Wan, W., Hirasawa, K., Hu, J., & Murata, J. (2001). Enhancing the generalization ability of neural networks by using Gram-Schmidt orthogonalization algorithm. In Proceedings of the International Joint Conference on Neural Networks (Vol. 3, pp. 1721-1726).