Abstract

Complex gradient methods are widely used in learning theory, typically to optimize real-valued functions of complex variables. The stepsize in complex gradient learning methods (CGLMs) is conventionally a positive real number, and little is known about how a complex stepsize would affect the learning process. To this end, we undertake a comprehensive analysis of CGLMs with a complex stepsize, covering the search space, convergence properties, and the dynamics near critical points. Furthermore, several adaptive stepsizes are derived by extending the Barzilai-Borwein method to the complex domain, showing that a complex stepsize captures the Hessian information better than the corresponding real one. A numerical example is presented to support the analysis.
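For intuition, the following is a minimal sketch (our illustration, not the paper's exact algorithm) of complex gradient descent with a complex Barzilai-Borwein-style stepsize. It minimizes the real-valued function f(z) = |z^2 - c|^2 of a complex variable z by stepping along the conjugate Wirtinger gradient, and forms the stepsize from complex inner products of successive iterate and gradient differences, so the stepsize is complex in general. The objective, the starting point, and the specific BB variant are assumptions made for this example.

```python
import numpy as np

# Minimize f(z) = |z**2 - c|**2, a real-valued function of the complex
# variable z. This objective and target c are illustrative choices.
c = 2.0 + 1.0j

def grad(z):
    # Conjugate Wirtinger gradient of f(z) = |z**2 - c|**2:
    # df/dz-bar = 2 * conj(z) * (z**2 - c)
    return 2.0 * np.conj(z) * (z ** 2 - c)

z_prev = 1.0 + 0.5j
g_prev = grad(z_prev)
z = z_prev - 0.05 * g_prev        # one fixed-stepsize step to seed BB

for _ in range(50):
    g = grad(z)
    if abs(g) < 1e-12:            # converged; avoid a 0/0 stepsize
        break
    s, y = z - z_prev, g - g_prev
    # BB1-style stepsize with the complex inner product <a, b> = conj(a)*b;
    # the denominator is complex in general, hence so is the stepsize mu.
    mu = (np.conj(s) * s) / (np.conj(s) * y)
    z_prev, g_prev = z, g
    z = z - mu * g

print(z ** 2)   # close to c = 2 + 1j, i.e. z is near a square root of c
```

In this sketch the ratio <s, s>/<s, y> is complex whenever s and y are not aligned in phase, so the step both scales and rotates the gradient, whereas a real BB stepsize would keep only the magnitude of this curvature estimate.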
