A genetic algorithm creates new attractors in an associative memory network by pruning synapses adaptively

Akira Imada, Keijiro Araki

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

We apply evolutionary algorithms to a neural network model of associative memory. In this model, certain configurations of the synaptic weights allow the network to store a number of patterns as an associative memory; the so-called Hebbian rule prescribes one such configuration. However, if the number of patterns to be stored exceeds a critical value (the over-loaded regime), the ability to recall stored patterns largely collapses. Likewise, synaptic weights chosen at random confer no such ability. In this paper, we describe a genetic algorithm that successfully evolves both random synapses and over-loaded Hebbian synapses to function as an associative memory by adaptively pruning some of the synaptic connections. Although many authors have shown that the model is robust against pruning a fraction of its synaptic connections, improving performance by pruning has not, to our knowledge, been explored.
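The approach described in the abstract can be sketched as follows: store patterns in a Hopfield-style network with Hebbian weights, then let a genetic algorithm evolve binary pruning masks over the synapses, scoring each mask by how well the masked network recalls the stored patterns. All sizes, genetic operators, and the one-step-recall fitness below are illustrative assumptions, not the authors' exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 32, 8  # neurons and stored patterns (P/N = 0.25 is above the ~0.14 Hopfield capacity)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix with zero diagonal, as in the standard Hopfield model
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def recall_overlap(mask):
    """Mean overlap between the stored patterns and a one-step network update,
    using only the synapses kept by the binary pruning mask."""
    Wm = W * mask
    out = np.sign(patterns @ Wm.T)  # local fields, then threshold
    out[out == 0] = 1               # break sign ties
    return np.mean(np.sum(out * patterns, axis=1) / N)

# Simple generational GA over pruning masks (chromosome = one bit per synapse)
pop_size, gens, mut_rate = 30, 40, 0.01
pop = rng.integers(0, 2, size=(pop_size, N, N))

for g in range(gens):
    fitness = np.array([recall_overlap(m) for m in pop])
    order = np.argsort(fitness)[::-1]
    elite = pop[order[: pop_size // 2]]            # truncation selection
    children = []
    for _ in range(pop_size - len(elite)):
        a, b = elite[rng.integers(len(elite), size=2)]
        cross = rng.integers(0, 2, size=(N, N)).astype(bool)
        child = np.where(cross, a, b)              # uniform crossover
        flip = rng.random((N, N)) < mut_rate       # bit-flip mutation
        child = np.where(flip, 1 - child, child)
        children.append(child)
    pop = np.concatenate([elite, np.array(children)])

best = max(pop, key=recall_overlap)  # best pruning mask found
```

A more faithful fitness would iterate the dynamics to a fixed point and probe noisy cues, but one-step overlap already lets selection favor masks whose pruned network keeps the stored patterns closer to being attractors.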

Original language: English
Pages (from-to): 1290-1297
Number of pages: 8
Journal: IEICE Transactions on Information and Systems
Volume: E81-D
Issue number: 11
Publication status: Published - Jan 1 1998

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence

