Most neuron models combine their inputs through weighted summation followed by a sigmoidal activation function. Networks built from such neurons suffer from shortcomings such as a large number of neurons in the hidden layers and the huge amount of training data required. To overcome these problems, we introduce a kind of multiplication neuron that multiplies its inputs instead of summing them. A hybrid universal learning network constructed by combining multiplication units and summation units is proposed and trained on several well-known benchmark problems. Different combinations of the two unit types are tried. The results clarify that multiplication is an essential computational element in many cases, and that combining multiplication units with summation units in different layers of the network improves its performance.
|Journal||IEEJ Transactions on Electronics, Information and Systems|
|Publication status||Published - 1 1 2003|
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering
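To illustrate the contrast the abstract draws, here is a minimal sketch of a conventional summation unit next to a multiplication unit. This is an assumed, simplified formulation for illustration only; the function names are hypothetical and the paper's actual multiplication neuron may weight or combine its inputs differently:

```python
import math

def sigmoid(z):
    # Standard logistic activation shared by both unit types.
    return 1.0 / (1.0 + math.exp(-z))

def summation_unit(inputs, weights, bias):
    # Conventional neuron: weighted sum of inputs plus bias,
    # passed through a sigmoid.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(s)

def multiplication_unit(inputs, weights, bias):
    # Assumed multiplication neuron: the weighted inputs are
    # multiplied together rather than summed before the sigmoid.
    p = 1.0
    for w, x in zip(weights, inputs):
        p *= w * x
    return sigmoid(p + bias)

# A zero net input yields 0.5 under the sigmoid for both units.
print(summation_unit([1.0, -1.0], [1.0, 1.0], 0.0))
print(multiplication_unit([2.0, 3.0], [1.0, 1.0], -6.0))
```

A product unit of this kind can represent certain input interactions (e.g. XOR-like cross terms) with fewer units than a purely summation-based layer, which is the motivation for the hybrid networks studied in the paper.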