Abstract
In this paper, we first extend the Wirtinger derivative, originally defined for complex functions, to hyperbolic functions, and use it to derive the hyperbolic gradient operator that yields the steepest descent direction. Next, we derive hyperbolic backpropagation learning algorithms for several multilayered hyperbolic neural networks (NNs) using the hyperbolic gradient operator. It is shown that the use of the Wirtinger derivative halves the effort needed to derive the learning algorithms, simplifies their representation, and makes them easier to implement in computer programs. In addition, we discuss the differences between the derived Hyperbolic-BP rules and the complex-valued backpropagation learning rule (Complex-BP). Finally, we conduct experiments with the derived learning algorithms. We find that the convergence rates of the Hyperbolic-BP learning algorithms are high even when fully hyperbolic activation functions are used, and that the Hyperbolic-BP learning algorithm for the hyperbolic NN with the split-type hyperbolic activation function has an inherent ability to learn hyperbolic rotation.
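The abstract fixes no notation, but the extension it describes can be sketched. Assuming the standard hyperbolic (split-complex) algebra, where z = x + uy with u^2 = 1 and u ≠ ±1, the complex Wirtinger pair carries over by replacing i with u; the sketch below follows from these assumed conventions and is not necessarily the paper's exact formulation:

% Hyperbolic number: z = x + u y,  u^2 = 1,  u \neq \pm 1,  so \bar{z} = x - u y.
% Hyperbolic analogues of the Wirtinger derivatives:
\frac{\partial}{\partial z}       = \frac{1}{2}\Bigl(\frac{\partial}{\partial x} + u\,\frac{\partial}{\partial y}\Bigr),
\qquad
\frac{\partial}{\partial \bar{z}} = \frac{1}{2}\Bigl(\frac{\partial}{\partial x} - u\,\frac{\partial}{\partial y}\Bigr).
% For a real-valued error E(z), writing the Euclidean gradient as a hyperbolic
% number gives E_x + u\,E_y = 2\,\partial E/\partial z, so steepest descent
% moves along the unconjugated derivative (unlike the complex case, where the
% conjugate derivative \partial E/\partial \bar{z} plays this role):
\Delta z \;\propto\; -\,\frac{\partial E}{\partial z}.

For the split-type network mentioned at the end, the activation presumably acts componentwise in the style of the split-type convention from the complex-valued literature, e.g. f(z) = tanh(x) + u tanh(y), so the real part and the u-part each pass through a real activation.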