Abstract
This paper presents a complex-valued neuron (CVN) model for real-valued classification problems and introduces two new activation functions. In this CVN model, each real-valued input is encoded as a phase between 0 and π of a unit-magnitude complex number and multiplied by a complex-valued weight. The weighted sum of inputs is then fed to an activation function. Both proposed activation functions map complex values to real values; their role is to divide the net-input (weighted-sum) space into multiple regions representing the classes of input patterns. Gradient-based learning rules are derived for each activation function. The ability of such a CVN is discussed and tested on two-class problems, such as two- and three-input Boolean problems and symmetry detection in binary sequences. We show that the CVN with either activation function can form proper decision boundaries for these linear and nonlinear problems. For solving n-class problems, a complex-valued neural network (CVNN) consisting of n CVNs is also studied, in which the neuron exhibiting the largest output among all neurons determines the output class. We tested such single-layered CVNNs on several real-world benchmark problems. The results show that the classification ability of the single-layered CVNN on unseen data is comparable to that of a conventional real-valued neural network (RVNN) with one hidden layer. Moreover, the CVNN converges much faster than the RVNN in most cases.
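To make the described encoding and decision rule concrete, the following minimal NumPy sketch phase-encodes real inputs onto the unit circle, computes a complex-weighted sum for each CVN, applies a stand-in complex-to-real activation, and selects the class of the neuron with the largest output. The linear input-to-phase scaling, the particular weights, and example_activation are illustrative assumptions; the paper's two proposed activation functions and its gradient-based learning rules are not reproduced here.

```python
import numpy as np

def encode_phase(x, x_min, x_max):
    """Map each real input to a unit-magnitude complex number whose
    phase lies between 0 and pi (assumed linear scaling of the input range)."""
    theta = np.pi * (x - x_min) / (x_max - x_min)
    return np.exp(1j * theta)

def example_activation(z):
    """Hypothetical complex-to-real activation (a stand-in, not one of the
    paper's two proposed functions)."""
    return z.real**2 - z.imag**2

def cvn_output(z, w, activation=example_activation):
    """Single CVN: complex-weighted sum of encoded inputs,
    followed by a complex-to-real activation."""
    net = np.dot(w, z)  # complex net input
    return activation(net)

def cvnn_predict(z, W, activation=example_activation):
    """n-class CVNN of n CVNs: the neuron with the largest
    real-valued output determines the predicted class."""
    outputs = np.array([cvn_output(z, w, activation) for w in W])
    return int(np.argmax(outputs)), outputs

# Example: a three-input Boolean pattern encoded as phases 0 (for 0) and pi (for 1).
x = np.array([0.0, 1.0, 1.0])
z = encode_phase(x, 0.0, 1.0)

# Illustrative complex weights for a two-class CVNN (one weight vector per CVN).
W = np.array([[0.3 + 0.5j, -0.2 + 0.1j, 0.4 - 0.3j],
              [0.1 - 0.4j,  0.6 + 0.2j, -0.5 + 0.3j]])

label, outs = cvnn_predict(z, W)
print("per-class outputs:", outs, "predicted class:", label)
```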