Universal Learning Networks with varying parameters

Kotaro Hirasawa, Jinglu Hu, Junichi Murata, Chunzhi Jin, Hironobu Etoh, Hironobu Katagiri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

The Universal Learning Network (ULN), a super-set of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. In the proposed method, there exist two kinds of networks: a basic network that includes the varying parameters, and a second network that calculates the optimal varying parameters depending on the network flows of the basic network. It is also proposed that any type of network, such as fuzzy inference networks, radial basis function networks, and neural networks, can be used for the basic and parameter-calculation networks. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that networks with the same number of varying parameters have higher representation abilities than conventional networks.
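The two-network scheme in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' implementation: here the "parameter-calculation network" is a small linear-plus-tanh map (standing in for the paper's fuzzy inference network), and the "varying parameters" are per-unit gains of the basic network's hidden layer, computed from the basic network's current input flow. All names (`param_network`, `basic_network`, `V`, `W`, `w_out`) are illustrative assumptions.

```python
import numpy as np

# Hypothetical parameter-calculation network: maps the basic network's
# input flow x to a set of varying parameters (here, per-unit gains).
# In the paper this role is played by e.g. a fuzzy inference network.
def param_network(x, V):
    return 1.0 + 0.5 * np.tanh(V @ x)  # one gain per hidden unit

# Basic network: a one-hidden-layer net whose hidden pre-activations are
# scaled by gains that vary with the current input, instead of being
# fixed constants after training.
def basic_network(x, W, w_out, V):
    g = param_network(x, V)            # parameters vary with the flow x
    h = np.tanh((W @ x) * g)           # hidden activations with varying gains
    return w_out @ h                   # scalar output

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))            # basic-network hidden weights
w_out = rng.normal(size=4)             # basic-network output weights
V = rng.normal(size=(4, 3))            # parameter-network weights

x = np.array([0.2, -0.5, 1.0])
y = basic_network(x, W, w_out, V)
```

Because the gains `g` depend on the input, the effective weights differ from one input to the next, which is the sense in which the same number of parameters can yield higher representation ability than a fixed-parameter network.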

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: IEEE
Pages: 1302-1307
Number of pages: 6
Volume: 2
Publication status: Published - 1999
Event: International Joint Conference on Neural Networks (IJCNN'99) - Washington, DC, USA
Duration: Jul 10 1999 to Jul 16 1999

Other

Other: International Joint Conference on Neural Networks (IJCNN'99)
City: Washington, DC, USA
Period: 7/10/99 to 7/16/99

Fingerprint

  • Fuzzy inference
  • Neural networks
  • Radial basis function networks
  • Supervised learning

All Science Journal Classification (ASJC) codes

  • Software

Cite this

Hirasawa, K., Hu, J., Murata, J., Jin, C., Etoh, H., & Katagiri, H. (1999). Universal Learning Networks with varying parameters. In Proceedings of the International Joint Conference on Neural Networks (Vol. 2, pp. 1302-1307). IEEE.

Universal Learning Networks with varying parameters. / Hirasawa, Kotaro; Hu, Jinglu; Murata, Junichi; Jin, Chunzhi; Etoh, Hironobu; Katagiri, Hironobu.

Proceedings of the International Joint Conference on Neural Networks. Vol. 2 IEEE, 1999. p. 1302-1307.


Hirasawa, K, Hu, J, Murata, J, Jin, C, Etoh, H & Katagiri, H 1999, Universal Learning Networks with varying parameters. in Proceedings of the International Joint Conference on Neural Networks. vol. 2, IEEE, pp. 1302-1307, International Joint Conference on Neural Networks (IJCNN'99), Washington, DC, USA, 7/10/99.
Hirasawa K, Hu J, Murata J, Jin C, Etoh H, Katagiri H. Universal Learning Networks with varying parameters. In Proceedings of the International Joint Conference on Neural Networks. Vol. 2. IEEE. 1999. p. 1302-1307
Hirasawa, Kotaro ; Hu, Jinglu ; Murata, Junichi ; Jin, Chunzhi ; Etoh, Hironobu ; Katagiri, Hironobu. / Universal Learning Networks with varying parameters. Proceedings of the International Joint Conference on Neural Networks. Vol. 2 IEEE, 1999. pp. 1302-1307
@inproceedings{3141619084f44d2e9b26771d69a39510,
title = "Universal Learning Networks with varying parameters",
abstract = "The Universal Learning Network (ULN), a super-set of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. In the proposed method, there exist two kinds of networks: a basic network that includes the varying parameters, and a second network that calculates the optimal varying parameters depending on the network flows of the basic network. It is also proposed that any type of network, such as fuzzy inference networks, radial basis function networks, and neural networks, can be used for the basic and parameter-calculation networks. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that networks with the same number of varying parameters have higher representation abilities than conventional networks.",
author = "Kotaro Hirasawa and Jinglu Hu and Junichi Murata and Chunzhi Jin and Hironobu Etoh and Hironobu Katagiri",
year = "1999",
language = "English",
volume = "2",
pages = "1302--1307",
booktitle = "Proceedings of the International Joint Conference on Neural Networks",
publisher = "IEEE",

}

TY - GEN

T1 - Universal Learning Networks with varying parameters

AU - Hirasawa, Kotaro

AU - Hu, Jinglu

AU - Murata, Junichi

AU - Jin, Chunzhi

AU - Etoh, Hironobu

AU - Katagiri, Hironobu

PY - 1999

Y1 - 1999

N2 - The Universal Learning Network (ULN), a super-set of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. In the proposed method, there exist two kinds of networks: a basic network that includes the varying parameters, and a second network that calculates the optimal varying parameters depending on the network flows of the basic network. It is also proposed that any type of network, such as fuzzy inference networks, radial basis function networks, and neural networks, can be used for the basic and parameter-calculation networks. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that networks with the same number of varying parameters have higher representation abilities than conventional networks.

AB - The Universal Learning Network (ULN), a super-set of supervised learning networks, has already been proposed. Parameters in a ULN are trained to optimize a criterion function, as in conventional neural networks, and after training they are used as constants. In this paper, a new method that alters the parameters depending on the network flows is presented to enhance the representation abilities of networks. In the proposed method, there exist two kinds of networks: a basic network that includes the varying parameters, and a second network that calculates the optimal varying parameters depending on the network flows of the basic network. It is also proposed that any type of network, such as fuzzy inference networks, radial basis function networks, and neural networks, can be used for the basic and parameter-calculation networks. Simulations in which the parameters of a neural network are altered by a fuzzy inference network show that networks with the same number of varying parameters have higher representation abilities than conventional networks.

UR - http://www.scopus.com/inward/record.url?scp=0033351399&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033351399&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0033351399

VL - 2

SP - 1302

EP - 1307

BT - Proceedings of the International Joint Conference on Neural Networks

PB - IEEE

ER -