Abstract
Deep learning has recently achieved great success in processing images, audio, natural language, and other domains. The activation function is one of the key factors in deep learning. In this paper, motivated by the characteristics of biological neurons, an improved Leaky Single-Peaked Triangle Linear Unit (LSPTLU) activation function is presented to address the unbounded right-hand response of the Rectified Linear Unit (ReLU) and Leaky ReLU (LReLU). LSPTLU is more in line with the essence of biological neurons and matches or exceeds the performance of ReLU and LReLU on a range of datasets, e.g., MNIST, Fashion-MNIST, SVHN, IMAGENET, CALTECH101, and CIFAR10.
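The abstract does not give the exact functional form of LSPTLU, but a minimal sketch of a triangular, right-bounded leaky unit consistent with its description might look like the following. The peak location theta and the leak slope alpha are illustrative assumptions, not parameters taken from the paper.

import numpy as np

def lsptlu_sketch(x, theta=1.0, alpha=0.01):
    """Hypothetical triangular leaky unit (not the paper's exact LSPTLU).

    Rises linearly to a single peak at x = theta, falls linearly back to
    zero by x = 2*theta (bounding the right-hand response, unlike ReLU),
    and leaks with slope alpha for negative inputs (like LReLU).
    """
    x = np.asarray(x, dtype=float)
    return np.where(
        x < 0, alpha * x,                       # leaky negative branch
        np.where(x <= theta, x,                 # linear rise to the peak
                 np.maximum(2 * theta - x, 0))  # linear fall, zero past 2*theta
    )

# The response is bounded above by theta, unlike ReLU's unbounded right side.
print(lsptlu_sketch([-2.0, 0.5, 1.0, 1.5, 3.0]))  # [-0.02  0.5  1.0  0.5  0.0]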