Associative list memory

Gough, M Paul (1997) Associative list memory. Neural Networks, 10 (6). pp. 1117-1131. ISSN 0893-6080

Full text not available from this repository.


This paper introduces an Associative List Memory (ALM) that achieves high recall fidelity with low memory and low processing requirements, permitting a simple software implementation on a personal computer or space instrument microprocessor. ALM performs comparably with Sparse Distributed Memory (SDM) but differs from SDM in two respects: convergence occurs during learning rather than on recall, and the memory takes the form of a dynamic list rather than static, randomly distributed locations. ALM is suitable for the unsupervised discovery of classes of phenomena in large databases. In particular, all of the deduced class exemplars, being essentially the contents of the list, can be accessed at any time to provide a summary of current database knowledge. Examples are given in which patterns 1000 bits long with > 30% noise are learned unsupervised to deduce the original patterns noise-free. A second pass through the data in recall mode can then assign each input to the appropriate original pattern, effectively removing all noise from the input data. At large input bit sizes the recall fidelity closely approaches the maximum possible value. ALM compares well in recall fidelity with SDM and other associative memories. Its processing times on a personal computer are found to be practical for database applications. Implemented within a space instrument processor, ALM would greatly reduce downlink data transmission rates.
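The full text is not available here, so the sketch below is only a plausible reading of the abstract: a dynamic list of entries, each merged toward its class exemplar during learning (convergence at learning time), with a second recall pass mapping noisy inputs back to clean exemplars. The class name, the per-bit vote counters, and the Hamming match threshold are illustrative assumptions, not the paper's actual ALM algorithm.

```python
import random


class AssociativeListMemory:
    """Hypothetical sketch of a list-based associative memory.

    Each list entry holds per-bit vote counters. An input within a
    Hamming-distance threshold of an entry's exemplar is merged into
    that entry (so the exemplar converges during learning); otherwise
    the input starts a new entry.
    """

    def __init__(self, n_bits, match_threshold):
        self.n_bits = n_bits
        self.match_threshold = match_threshold  # max Hamming distance to merge
        self.entries = []  # per-bit counters: +1 votes for bit 1, -1 for bit 0

    def _exemplar(self, counters):
        # Majority vote over the accumulated bit counters.
        return [1 if c > 0 else 0 for c in counters]

    def _hamming(self, a, b):
        return sum(x != y for x, y in zip(a, b))

    def learn(self, pattern):
        # Merge into the first sufficiently close entry, else append.
        for counters in self.entries:
            if self._hamming(pattern, self._exemplar(counters)) <= self.match_threshold:
                for i, bit in enumerate(pattern):
                    counters[i] += 1 if bit else -1
                return
        self.entries.append([1 if b else -1 for b in pattern])

    def recall(self, pattern):
        # Second pass: return the nearest stored exemplar, i.e. the
        # noise-free pattern assigned to this input.
        best = min(self.entries,
                   key=lambda c: self._hamming(pattern, self._exemplar(c)))
        return self._exemplar(best)
```

Under this reading, a run over many 1000-bit inputs carrying ~30% bit noise would leave one list entry per underlying class, and the list contents at any moment are exactly the current exemplar summary the abstract describes.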

Item Type: Article
Keywords: Artificial neural networks; Associative memory; Pattern recognition; Space instrumentation; Real-time data analysis; Evolutionary instruments
Schools and Departments: School of Engineering and Informatics > Engineering and Design
Depositing User: Paul Gough
Date Deposited: 06 Feb 2012 21:04
Last Modified: 17 Jul 2012 08:43