Spiking randomly connected neural network (RNN) hardware is promising as an ultimately low-power device for temporal data processing at the edge. Although the potential of RNNs for temporal data processing has been demonstrated, the randomness of the network architecture often degrades performance. To mitigate this degradation, a self-organization mechanism based on intrinsic plasticity (IP) and synaptic plasticity (SP) should be implemented in the spiking RNN. We therefore propose hardware-oriented models of these functions. To implement IP, a variable firing threshold is introduced to each excitatory neuron in the RNN; it changes stepwise in accordance with the neuron's activity. For SP, we also define additional thresholds that are synchronized with the firing threshold and determine the direction of the stepwise synaptic update executed upon arrival of a pre-synaptic spike. To assess the effectiveness of our models, we perform simulations of temporal data learning and anomaly detection with a spiking RNN using publicly available electrocardiograms (ECGs). We observe that the spiking RNN equipped with our IP and SP models achieves a true positive rate of 1 while suppressing the false positive rate to 0, which is not achieved otherwise. Furthermore, we find that these thresholds, as well as the synaptic weights, can be reduced to binary values if the RNN architecture is appropriately designed. This contributes to minimizing the circuitry of a neuronal system with IP and SP.
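The following is a minimal sketch, not the authors' implementation, of the stepwise IP/SP idea described above. It assumes a leaky integrate-and-fire excitatory population with sparse random recurrent connections, a fixed threshold step size, and SP thresholds placed at a fixed offset from the firing threshold; all names, parameter values, and the specific placement of the SP thresholds are hypothetical.

```python
# Hedged sketch of stepwise intrinsic plasticity (IP) and threshold-synchronized
# synaptic plasticity (SP) in a spiking randomly connected network.
# Assumptions (not from the abstract): LIF dynamics, target-rate-based IP,
# SP thresholds at theta and 0.5*theta, and the listed constants.
import numpy as np

rng = np.random.default_rng(0)

N = 100                 # number of excitatory neurons in the random recurrent network
TARGET_RATE = 0.05      # assumed per-step target firing probability (IP set point)
THETA_STEP = 0.01       # stepwise change applied to the firing threshold (IP)
W_STEP = 0.01           # stepwise change applied to a synaptic weight (SP)

v = np.zeros(N)         # membrane potentials
theta = np.ones(N)      # variable firing thresholds, one per excitatory neuron
W = rng.random((N, N)) * (rng.random((N, N)) < 0.1)  # sparse random recurrent weights


def step(ext_input):
    """One simulation step with stepwise IP and threshold-synchronized SP."""
    global v, theta, W
    # Leaky integration of recurrent and external input.
    spikes_prev = (v >= theta).astype(float)
    v = 0.9 * v + W @ spikes_prev + ext_input
    spikes = v >= theta

    # IP: each neuron's threshold moves one step up when it fires and one step
    # down otherwise, pushing its activity toward the assumed target rate.
    theta += np.where(spikes, THETA_STEP * (1.0 - TARGET_RATE),
                              -THETA_STEP * TARGET_RATE)

    # SP: when a pre-synaptic spike arrives, the weight changes by one step;
    # the direction is chosen by comparing the post-synaptic potential with
    # SP thresholds that track (synchronize with) the firing threshold.
    sp_upper = theta[:, np.newaxis]        # assumed potentiation threshold = theta
    sp_lower = 0.5 * theta[:, np.newaxis]  # assumed depression threshold = theta/2
    pre = spikes_prev[np.newaxis, :]       # 1 where a pre-synaptic spike arrived
    potentiate = (v[:, np.newaxis] >= sp_upper) * pre
    depress = (v[:, np.newaxis] < sp_lower) * pre
    W += W_STEP * (potentiate - depress) * (W > 0)  # only existing synapses change
    np.clip(W, 0.0, 1.0, out=W)

    # Reset the membrane potential of neurons that fired.
    v = np.where(spikes, 0.0, v)
    return spikes


for t in range(200):
    step(rng.random(N) * 0.2)
```

Because both updates are fixed-size steps triggered by local events (a spike or a threshold crossing), each can be realized with a small counter or comparator per neuron or synapse, which is consistent with the abstract's point that the thresholds and weights can be reduced to binary in a suitably designed architecture.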