Abstract

This article presents concurrent associative memories with synaptic delays for processing sequences of real vectors. Associative memories with synaptic delays were introduced by the authors for symbolic sequential inputs and demonstrated several advantages over other sequential memories: they were easy to organize and train, and they proved more robust than long short-term memories (LSTMs) in the recognition of damaged sequences. The associative memories can be combined with deep neural networks to solve symbol grounding problems, such as speech recognition, and to support sequential memories triggered by sensory inputs. Several practical considerations for the developed memories are discussed and illustrated. A continuous speech database was used to compare the developed method with LSTM memories. Tests demonstrated that the developed approach is more robust in the recognition of speech sequences, particularly when the test sequences are damaged.

Highlights

  • Memories are essential elements of cognitive systems, storing learned knowledge and episodes of the system's past interactions with the environment [1]

  • The results are compared with those obtained with a long short-term memory (LSTM) network, preceded by the deep network and followed by one dense layer used for classification (see the sketch after this list)

  • This work describes an extension of the symbolic form of the synaptic delay associative knowledge graph (SDAKG), in which the inputs were represented by sequences of symbols rather than vectors of real sensory data
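
The following minimal Python sketch illustrates the baseline architecture mentioned above. It is a hypothetical reconstruction: the framework (Keras), the layer sizes, the number of classes, and the 40-dimensional frame features are assumptions, not the authors' exact configuration.

    import tensorflow as tf

    NUM_CLASSES = 10   # assumed number of sequence classes
    NUM_FEATURES = 40  # assumed dimensionality of each input frame

    baseline = tf.keras.Sequential([
        # Deep (convolutional) front end producing per-frame features
        tf.keras.layers.Conv1D(64, 3, activation="relu",
                               input_shape=(None, NUM_FEATURES)),
        tf.keras.layers.Conv1D(64, 3, activation="relu"),
        # Recurrent memory over the frame sequence
        tf.keras.layers.LSTM(128),
        # The single dense layer used for classification
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    baseline.compile(optimizer="adam",
                     loss="sparse_categorical_crossentropy",
                     metrics=["accuracy"])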

Summary

INTRODUCTION

Memories are essential elements of cognitive systems, storing learned knowledge and episodes of the system's past interactions with the environment [1]. Because neurons transmit their spikes to a polysynchronous group with various delays, the polysynchronous neurons fire when the differences between these delays correspond to the differences between the input activation times [10]. This idea was developed in [11] to create an associative network of pulsing neurons that can store and recognize sequences of activation of the neurons that represent the input symbols. SDAKG is a new type of associative memory in which the synaptic connections of a self-organizing neural network include information about the time delays between input sequence elements. This article describes a new algorithm for time-series recognition based on the propagation of signals along separate paths of synaptic gates that represent classes of sequences. This tool was connected with a convolutional neural network on the input to assess the class membership of frames.
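The delay-matching principle behind polysynchronous firing can be illustrated with a minimal Python sketch. Everything here (the coincidence window, the function name, and the example delays) is an illustrative assumption rather than the authors' implementation:

    # A detector neuron receives spikes through synapses with different
    # delays. It fires only when the differences between the input spike
    # times compensate the differences between the synaptic delays, so
    # that all spikes arrive within a small coincidence window.

    COINCIDENCE_WINDOW = 0.5  # ms; illustrative tolerance (an assumption)

    def polysynchronous_fires(spike_times, synaptic_delays,
                              window=COINCIDENCE_WINDOW):
        """True if all spikes arrive (spike time + delay) nearly together."""
        arrivals = [t + d for t, d in zip(spike_times, synaptic_delays)]
        return max(arrivals) - min(arrivals) <= window

    delays = [5.0, 3.0, 1.0]  # delays of three synapses (ms)

    # Inter-spike intervals mirror the delay differences: all spikes
    # arrive at t = 5 ms, so the detector fires.
    print(polysynchronous_fires([0.0, 2.0, 4.0], delays))  # True

    # Wrong timing spreads the arrivals out: no firing.
    print(polysynchronous_fires([0.0, 0.0, 0.0], delays))  # False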

SDAKG WITH PARTIALLY ACTIVATED INPUTS
Self-Organization of the Synaptic Gates Memory
Gate Deactivation Rates
PRACTICAL CONSIDERATIONS IN CONCURRENT SDAKG
Uniform Noise Consideration
Scaling Factor
Normal Noise Distribution
Effect of Sequence Length Distribution
Power of Scale Factor Vector
Covariance-Based Similarity
TESTS ON SPEECH DATA SET
Experiments on All Classes of the Resampled Sequences of Phonemes
Experiments With the Three-Layer Perceptron on the Output
TESTS ON THE SIGN LANGUAGE DATA SET
Network Architecture and Results of Sequence Damaging
PROCESSING SPEED
Findings
CONCLUSION