Abstract

Deep learning has recently achieved great success in processing images, audio, natural language, and other modalities. The activation function is one of the key factors in deep learning. In this paper, motivated by the characteristics of biological neurons, an improved Leaky Single-Peaked Triangle Linear Unit (LSPTLU) activation function is presented to address the unbounded right-hand response of the Rectified Linear Unit (ReLU) and Leaky ReLU (LReLU). LSPTLU is more consistent with the essential behavior of biological neurons and matches or exceeds the performance of ReLU and LReLU on a range of datasets, e.g., MNIST, Fashion-MNIST, SVHN, ImageNet, Caltech101, and CIFAR10.
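The abstract does not state the LSPTLU formula, so the sketch below only illustrates the contrast it describes: ReLU and LReLU grow without bound for positive inputs, whereas a single-peaked triangular unit is bounded on the right. The `lsptlu_sketch` function, its `peak` parameter, and its piecewise form are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def relu(x):
    # ReLU: zero for x < 0, identity for x >= 0; unbounded on the right.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # LReLU: small negative slope alpha for x < 0; still unbounded on the right.
    return np.where(x < 0, alpha * x, x)

def lsptlu_sketch(x, alpha=0.01, peak=1.0):
    # Hypothetical triangle-shaped unit: leaky for x < 0, rises linearly to a
    # single peak at x = peak, then falls linearly back to zero, so the
    # right-hand response is bounded. The actual LSPTLU definition is given
    # in the paper; this piecewise form is only an assumption.
    return np.where(
        x < 0, alpha * x,
        np.where(x <= peak, x, np.maximum(2 * peak - x, 0.0))
    )

xs = np.linspace(-2.0, 4.0, 7)
print(relu(xs))         # unbounded as x grows
print(leaky_relu(xs))   # unbounded as x grows
print(lsptlu_sketch(xs))  # peaks at 1.0, then returns to 0
```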
