Abstract

In this work, a programmable neuron is proposed that approximates the following activation functions: sigmoid, hyperbolic tangent, and linear. In other words, the neural network designer can use several control bits to select the type of activation function without any physical change. The proposed neuron, simulated in a 0.18 μm CMOS technology, shows a good approximation, with maximum errors of 29.33% and 7.4% relative to the ideal hyperbolic tangent and ideal sigmoid functions, respectively. To evaluate the functionality of the neuron, it is applied in two Multi-Layer Perceptron (MLP) neural networks. The first, trained to implement an XOR gate, is capable of processing signals in the frequency range from 2.5 mHz to 50 MHz, and its accuracy exceeds 99.9%. The second is a pattern-recognition neural network; comparison with previous work reveals a 48% decrease in the network's power consumption. Moreover, the proposed neuron has been applied in the fully connected (FC) layers of a convolutional neural network (CNN), and an experiment has been conducted on the benchmark MNIST dataset.
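The behavior described above, where a few control bits select which activation function the same neuron computes, can be sketched in software. This is a minimal illustrative model only; the control-bit encoding (00 = sigmoid, 01 = tanh, 10 = linear) is an assumption for the example and is not taken from the paper, whose neuron is an analog CMOS circuit rather than code.

```python
import math

def programmable_activation(x: float, ctrl: int) -> float:
    """Behavioral model of a programmable neuron's activation stage.

    ctrl is a 2-bit control word selecting the activation function.
    The encoding used here is hypothetical, chosen only for illustration.
    """
    if ctrl == 0b00:        # sigmoid: 1 / (1 + e^(-x))
        return 1.0 / (1.0 + math.exp(-x))
    elif ctrl == 0b01:      # hyperbolic tangent
        return math.tanh(x)
    elif ctrl == 0b10:      # linear (identity)
        return x
    raise ValueError(f"unsupported control code: {ctrl:#04b}")
```

Changing the activation of an already-built network then amounts to rewriting the control word, mirroring how the hardware neuron is reconfigured without any physical change.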
