Abstract
The capsule network's (CapsNet) hierarchical framework begins with a standard convolution layer that applies an activation function at its core. Among the many existing activation functions, the rectified linear unit (ReLU) is the most widely used in CapsNets and in brain tumor classification tasks. However, ReLU has a shortcoming: its zero derivative for negative inputs can prevent neurons from activating. Furthermore, the accuracy obtained by ReLU with CapsNet on brain tumor classification is unsatisfactory. We propose a new activation function called the parametric scaled hyperbolic tangent (PSTanh), which enhances the conventional hyperbolic tangent by avoiding the vanishing gradient problem, providing a small but non-zero gradient through the introduction of the α and β parameters, and enabling faster optimization. Eight standard activation functions (i.e., tanh, ReLU, Leaky‐ReLU, PReLU, ELU, SELU, Swish, and the ReLU‐Memristor‐Like Activation Function (RMAF)) and the proposed activation are analyzed and compared on brain tumor classification tasks. Furthermore, extensive experiments are conducted using the MNIST, Fashion‐MNIST, CIFAR‐10, CIFAR‐100, and ImageNet datasets trained on CapsNet models and deep CNN models (i.e., AlexNet, SqueezeNet, ResNet50, and DenseNet121). The experimental results on brain tumors with both the CapsNet and CNN models show that the proposed PSTanh activation achieves better performance than the other functions.
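The abstract names the α and β parameters but does not spell out PSTanh's closed form. The sketch below is one plausible realization in PyTorch, assuming a Swish-like gated form f(x) = α · x · (1 + tanh(βx)) with learnable α and β; the class name, the assumed formula, and the default values α = 0.5, β = 1.0 are illustrative and may differ from the paper's actual definition.

```python
import torch
import torch.nn as nn


class PSTanh(nn.Module):
    """Illustrative parametric scaled hyperbolic tangent (PSTanh).

    ASSUMPTION: the abstract does not give the closed form, so this
    sketch uses f(x) = alpha * x * (1 + tanh(beta * x)) with learnable
    alpha and beta; the definition in the paper body may differ.
    """

    def __init__(self, alpha: float = 0.5, beta: float = 1.0) -> None:
        super().__init__()
        # Learnable scale parameters, updated by backpropagation along
        # with the rest of the network's weights.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # 1 + tanh(beta * x) equals 2 * sigmoid(2 * beta * x), so the
        # output is smooth and keeps a small non-zero gradient for
        # negative inputs, unlike ReLU's hard zero cut-off.
        return self.alpha * x * (1.0 + torch.tanh(self.beta * x))


if __name__ == "__main__":
    act = PSTanh()
    x = torch.linspace(-3.0, 3.0, steps=7, requires_grad=True)
    y = act(x)
    y.sum().backward()
    print("outputs:  ", y.detach())
    print("gradients:", x.grad)  # small but non-zero for x < 0
```

In a comparison like the one the abstract describes, a module of this kind would simply replace the ReLU activation after CapsNet's initial convolution layer (or the activations in AlexNet, SqueezeNet, ResNet50, and DenseNet121), with α and β trained jointly with the network.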