File(s) not publicly available
A neural model for context-dependent sequence learning
journal contribution
posted on 2023-06-07, 19:57 authored by Luc Berthouze, Adriaan Tijsseling

A novel neural network model is described that implements context-dependent learning of complex sequences. The model utilises leaky integrate-and-fire neurons to extract timing information from its input and modifies its weights using a learning rule with synaptic noise. Learning and recall phases are seamlessly integrated so that the network can gradually shift from learning to predicting its input. Experimental results using data from a real-world problem domain demonstrate that the use of context has three important benefits: (a) it prevents catastrophic interference during learning of multiple overlapping sequences, (b) it enables the completion of sequences from missing or noisy patterns, and (c) it provides a mechanism to selectively explore the space of learned sequences during free recall.
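The abstract names two concrete ingredients: leaky integrate-and-fire neurons that extract timing information, and a weight update rule with synaptic noise. As a rough illustration only, the sketch below shows minimal Python versions of both. It is not the authors' model: the neuron parameters, the Hebbian form of the update, and the Gaussian noise term are all assumptions made for the sake of the example.

```python
"""Illustrative sketch only: a leaky integrate-and-fire (LIF) neuron and a
Hebbian-style weight update with additive synaptic noise. All parameter
values and the exact form of the update rule are assumptions, not details
taken from the paper."""

import numpy as np

rng = np.random.default_rng(0)


def lif_step(v, i_in, tau=20.0, dt=1.0, v_thresh=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire membrane.

    v      : current membrane potential
    i_in   : input current at this step
    returns: (new potential, spike flag)
    """
    v = v + dt / tau * (-v + i_in)   # leaky integration of the input
    if v >= v_thresh:                # threshold crossing produces a spike
        return v_reset, True
    return v, False


def hebbian_noisy_update(w, pre, post, eta=0.01, noise_std=0.001):
    """Assumed Hebbian-style rule: dw = eta * post * pre + Gaussian noise."""
    return w + eta * np.outer(post, pre) + rng.normal(0.0, noise_std, w.shape)


if __name__ == "__main__":
    # Drive a single LIF neuron with a constant current and count spikes;
    # the spike times carry the timing information mentioned in the abstract.
    v, spikes = 0.0, 0
    for _ in range(200):
        v, fired = lif_step(v, i_in=1.2)
        spikes += int(fired)
    print(f"spikes in 200 steps: {spikes}")

    # Apply one noisy update to a small random weight matrix.
    w = rng.normal(0.0, 0.1, (3, 4))
    pre = rng.random(4)    # presynaptic activity
    post = rng.random(3)   # postsynaptic activity
    print(hebbian_noisy_update(w, pre, post))
```

The noise term is what makes recall exploratory rather than deterministic: repeated recall from the same cue can follow different learned branches, which is one plausible reading of the "selective exploration" benefit described above.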
History
Publication status
- Published
Journal
Neural Processing Letters
ISSN
1370-4621
Publisher
Springer Verlag
External DOI
Issue
1
Volume
23
Page range
27-45
Pages
19
Department affiliated with
- Informatics Publications
Notes
Originality: Novel neural network model that implements context-dependent learning of complex sequences. Learning and recall phases are seamlessly integrated.
Rigour: Mathematical simulations based on data from the public domain.
Significance: This biologically inspired model has three key features: it can learn multiple overlapping sequences; it preserves the timing of the sequences; it can complete incomplete or noisy sequences.
Impact: The model is directly applicable to real-world applications, e.g. robotic applications, or usable as a component of a larger cognitive system.
Full text available
- No
Peer reviewed?
- Yes
Legacy Posted Date
2012-02-06