Abstract

One of the challenges in designing a neural network to process complex-valued signals is finding a suitable nonlinear complex activation function. The main source of this difficulty is the conflict between boundedness and differentiability of complex functions over the entire complex plane, as stated by Liouville's theorem. To sidestep this difficulty, the traditional approach has been splitting, i.e., applying two separate real nonlinear activation functions to the real and imaginary signal components. We introduce a feedforward neural network (FNN) architecture employing the hyperbolic tangent function tanh(z) defined over the entire complex domain, and compare its performance with an FNN that uses a split-complex structure. Since tanh(z) is analytic and bounded almost everywhere in the complex plane, when trained by backpropagation it can easily outperform the non-analytic split-complex activation function in convergence speed and achievable minimum squared error when the domain is bounded around the unit circle. We demonstrate this property with an equalization example: equalization of multi-phase shift keying (MPSK) signals corrupted by a multipath channel. The properties of tanh(z) and future directions for combating nonlinear distortions in complex transmission schemes are discussed.
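The contrast between the two activation strategies can be sketched as follows: the fully complex tanh(z) is applied directly to the complex pre-activation, while the split approach applies a real tanh to the real and imaginary parts independently. This is a minimal illustrative sketch (not the paper's implementation); the function names are our own.

```python
import numpy as np

# Fully complex activation: tanh(z) is analytic everywhere except at its
# poles z = j*pi*(k + 1/2), and remains bounded near the unit circle.
def complex_tanh(z):
    return np.tanh(z)  # NumPy's tanh supports complex arguments

# Split-complex activation: the traditional non-analytic approach, applying
# a real tanh separately to the real and imaginary components.
def split_tanh(z):
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

z = np.array([0.5 + 0.5j, -1.0 + 0.25j])
print(complex_tanh(z))
print(split_tanh(z))
```

On the real axis the two coincide, but for genuinely complex inputs they differ; only the fully complex form has a complex derivative, which is what allows a fully complex backpropagation rule.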
