J. Phys. France
Volume 51, Number 21, November 1990
Pages 2387-2393
DOI: https://doi.org/10.1051/jphys:0199000510210238700

Performance enhancement of Willshaw type networks through the use of limit cycles

G.A. Kohring

HLRZ an der KFA Jülich, D-5170 Jülich, F.R.G.


Abstract
Simulation results of a Willshaw type model for storing sparsely coded patterns are presented. It is suggested that random patterns can be stored in Willshaw type models by transforming them into a set of sparsely coded patterns and retrieving this set as a limit cycle. In this way, the number of steps needed to recall a pattern becomes a function of the amount of information the pattern contains. A general algorithm for simulating neural networks with sparsely coded patterns is also discussed; on a fully connected network of N = 36 864 neurons (1.4 × 10^9 couplings), it is shown to achieve effective updating speeds as high as 1.6 × 10^11 coupling evaluations per second on one Cray-YMP processor.
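
The storage and retrieval scheme sketched in the abstract can be illustrated in a few lines of code. The Python sketch below is only an illustration under assumed parameters, not the paper's vectorized Cray-YMP implementation: it stores a short cyclic sequence of sparse binary patterns in clipped (0/1) asymmetric couplings of a Willshaw type network and then recalls the sequence as a limit cycle. The sizes N, L and the activity k, as well as the threshold rule, are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

N = 1024   # number of neurons (illustrative; the paper uses N = 36 864)
L = 8      # length of the stored cycle (number of sparse patterns)
k = 16     # active neurons per pattern, k << N (sparse coding)

# L sparse binary patterns, each with exactly k active neurons.
patterns = np.zeros((L, N), dtype=np.uint8)
for mu in range(L):
    patterns[mu, rng.choice(N, size=k, replace=False)] = 1

# Willshaw-type storage: couplings are clipped to {0, 1}.  The asymmetric
# rule J[i, j] = 1 whenever neuron j is active in pattern mu and neuron i is
# active in pattern mu+1 (mod L) makes the stored sequence a limit cycle of
# the retrieval dynamics.
J = np.zeros((N, N), dtype=np.uint8)
for mu in range(L):
    J |= np.outer(patterns[(mu + 1) % L], patterns[mu])

def update(state, theta):
    """One parallel update: a neuron fires if its summed input reaches theta."""
    h = J @ state.astype(np.int64)
    return (h >= theta).astype(np.uint8)

# Retrieval: start on the first pattern and iterate.  With the threshold set
# to the pattern activity k, a clean pattern maps onto its successor, so the
# network steps through the whole cycle; one pass over the cycle recalls all
# the sparse patterns that together encode the original pattern.
state = patterns[0].copy()
for t in range(2 * L):
    state = update(state, theta=k)
    overlaps = patterns @ state.astype(np.int64)
    print(f"step {t + 1:2d}: closest stored pattern = {int(np.argmax(overlaps))}")

With these parameters the crosstalk between transitions is negligible, so the printed index advances through the cycle 0, 1, ..., L-1 and repeats, which is the limit-cycle retrieval the abstract refers to.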

PACS
87.18S - Neural networks.
87.10 - General theory and mathematical aspects.
07.05M - Neural networks, fuzzy logic, artificial intelligence.

Key words
limit cycles -- neural nets -- neurophysiology