Abstract

Liquid State Machine (LSM) is a brain-inspired architecture used for solving problems such as speech recognition and time-series prediction. An LSM comprises a randomly connected recurrent network of spiking neurons, which propagates non-linear neuronal and synaptic dynamics. Maass et al. have argued that these non-linear dynamics are essential for the LSM's performance as a universal computer. The Lyapunov exponent (µ), used to characterize the non-linearity of the network, correlates well with LSM performance. We propose a complementary approach that approximates the LSM dynamics with a linear state-space representation. The spike rates from this model correlate well with the spike rates from the LSM. Such equivalence allows the extraction of a memory metric (τ_M) from the state transition matrix. τ_M correlates strongly with performance, and high-τ_M systems require fewer epochs to achieve a given accuracy. Being computationally cheap (1800× more time-efficient than simulating the LSM), the τ_M metric enables exploration of the vast parameter design space. We observe that the performance correlation of τ_M surpasses that of the Lyapunov exponent (µ) by 2-4× in the high-performance regime across multiple datasets. In fact, while µ increases monotonically with network activity, performance reaches a maximum at a specific activity level described in the literature as the edge of chaos. τ_M, on the other hand, remains correlated with LSM performance. Hence, τ_M captures the useful memory of network activity that enables LSM performance, and it supports rapid design-space exploration and fine-tuning of LSM parameters for high performance.
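The abstract does not give the fitting procedure or the exact definition of τ_M, so the following is only a minimal sketch of the idea under stated assumptions: fit a discrete-time linear state-space model x[t+1] ≈ A x[t] + B u[t] to the network's spike rates by least squares, then read a memory time constant off the slowest-decaying eigenvalue of the state transition matrix A. The function names, the least-squares fit, and the eigenvalue-based definition of τ_M are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fit_linear_state_space(X, U):
    """Least-squares fit of x[t+1] ~= A x[t] + B u[t].

    X: (T, n) array of observed spike rates (one row per time step).
    U: (T, m) array of external inputs.
    Returns the state transition matrix A (n, n) and input matrix B (n, m).
    """
    # Regress each next state on the current state and input.
    Z = np.hstack([X[:-1], U[:-1]])            # (T-1, n+m) regressors
    Y = X[1:]                                  # (T-1, n) targets
    W, *_ = np.linalg.lstsq(Z, Y, rcond=None)  # W solves Y ~= Z @ W
    n = X.shape[1]
    A = W[:n].T                                # state transition matrix
    B = W[n:].T                                # input matrix
    return A, B

def memory_metric(A, dt=1.0):
    """Assumed definition of tau_M: time constant of the slowest-decaying
    mode of A, i.e. tau_M = -dt / ln(|lambda_max|)."""
    lam = np.max(np.abs(np.linalg.eigvals(A)))
    lam = min(lam, 1.0 - 1e-12)  # guard against non-decaying modes
    return -dt / np.log(lam)

# Toy usage with random data standing in for LSM spike rates and inputs.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))  # T=500 steps, n=20 neurons
U = rng.standard_normal((500, 5))   # m=5 input channels
A, B = fit_linear_state_space(X, U)
print(f"tau_M ~= {memory_metric(A):.2f} time steps")
```

Because this only requires one linear regression and one eigendecomposition rather than a full spiking simulation, it illustrates why such a metric can be orders of magnitude cheaper to evaluate than the LSM itself.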
