This paper discusses the convergence of numerical solutions of a neural network that is stable in the sense that every trajectory asymptotically approaches an equilibrium solution. It is first shown that when the connections among neurons are weak, the upper limit of the step width for which the asynchronous iteration converges is the same as that of the synchronous iteration. It is pointed out that the asynchronous method is therefore expected to converge faster on a parallel computer. On the other hand, when the synaptic connections are symmetric, the asynchronous iteration does not converge in general unless the step width is set smaller than that of the synchronous iteration. It is also shown that the stiffness of the Hopfield network used to solve combinatorial optimization problems increases with the network scale. Since the computational cost of the forward Euler method is consequently expected to grow rapidly, an iterative method based on the backward Euler method is proposed, and it is shown to have a better convergence property. Finally, the foregoing theoretical results are verified qualitatively through simple examples.
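The forward/backward Euler contrast in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the Hopfield-type dynamics `du/dt = -u + W·tanh(u) + b`, the weak coupling matrix `W`, the step width `h`, and the fixed-point inner solver for the implicit update are all assumptions, since the abstract does not give the network equations.

```python
import numpy as np

# Hypothetical Hopfield-type dynamics du/dt = -u + W @ tanh(u) + b;
# the paper's exact network equations are not given in the abstract.
rng = np.random.default_rng(0)
n = 4
W = 0.1 * rng.standard_normal((n, n))  # weak synaptic coupling (assumed scale)
b = rng.standard_normal(n)

def f(u):
    return -u + W @ np.tanh(u) + b

def forward_euler(u, h):
    # Explicit update: stability restricts the admissible step width h,
    # and stiffness forces h to shrink as the network scale grows.
    return u + h * f(u)

def backward_euler(u, h, iters=50):
    # Implicit update u_next = u + h * f(u_next), solved here by a
    # simple fixed-point iteration started from the explicit guess.
    v = forward_euler(u, h)
    for _ in range(iters):
        v = u + h * f(v)
    return v

u = np.zeros(n)
h = 0.5
for _ in range(200):
    u = backward_euler(u, h)
# Near an equilibrium, the residual f(u) should be numerically zero.
print(np.linalg.norm(f(u)))
```

With weak coupling the inner fixed-point iteration is a contraction, so the implicit step converges without forcing a smaller step width; for a stiff, larger-scale network this is where the explicit method would instead require many small steps.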
Number of pages: 8
Journal: Electronics and Communications in Japan (Part III: Fundamental Electronic Science)
Publication status: Published - 1992
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering