Abstract
Quaternionic convolutional neural networks (QCNNs) can capture both external dependencies between neighboring features and internal latent dependencies within the features of an input vector. In this study, we employ QCNNs with activation functions based on Bessel-type functions with trainable parameters to perform classification tasks. Our experimental results demonstrate that these activation functions outperform the traditional ReLU activation function. Throughout our simulations, we explore various network architectures. Activation functions with trainable parameters offer several advantages, including enhanced flexibility, adaptability, improved learning, customized model behavior, and automatic feature extraction.
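As a minimal sketch of the general idea (not the paper's exact formulation), a Bessel-type activation with a trainable parameter can be written as a small PyTorch module whose scale parameter is learned by backpropagation alongside the network weights. The module name `TrainableBesselActivation`, the initial value `init_scale`, and the specific form `i0e(alpha * x)` (the exponentially scaled modified Bessel function of the first kind, order 0) are illustrative assumptions, since the abstract does not specify the functional form; the quaternionic convolution layers themselves are omitted here.

```python
import torch
import torch.nn as nn

class TrainableBesselActivation(nn.Module):
    """Hypothetical Bessel-type activation with a trainable scale parameter.

    The form x -> i0e(alpha * x) is an assumed stand-in for illustration;
    the paper's exact Bessel-type activation is not given in the abstract.
    """
    def __init__(self, init_scale: float = 1.0):
        super().__init__()
        # alpha is learned jointly with the network weights via backpropagation
        self.alpha = nn.Parameter(torch.tensor(init_scale))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.alpha * x
        # torch.special.i0e computes exp(-|z|) * I0(z), which stays
        # numerically bounded for large |z| (plain I0 grows exponentially)
        return torch.special.i0e(z)

# Usage: drop the module in wherever ReLU would otherwise appear
layer = nn.Sequential(nn.Linear(8, 16), TrainableBesselActivation(), nn.Linear(16, 4))
out = layer(torch.randn(2, 8))
```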