Abstract
To avoid local minima in back propagation learning, we propose to treat feedforward neural network training as a global optimization problem. In particular, we consider using branch-and-bound based Lipschitz optimization methods in neural network training, and develop globally optimal training algorithms (GOTA). The standard criterion function of a feedforward neural network is Lipschitzian. The effectiveness of GOTA is improved by using dynamically computed local Lipschitz constants over subsets of the weight space. Local search procedures, such as the classic back propagation algorithm, can be incorporated into GOTA. The resulting local search-augmented global algorithms improve the learning efficiency of GOTA while retaining its global convergence property.
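To make the branch-and-bound Lipschitz idea concrete, the following is a minimal Python sketch of the general technique, not the paper's actual GOTA: it minimizes the squared-error criterion of a one-weight toy network over an interval, using a fixed global Lipschitz constant to lower-bound the loss on each subinterval and pruning subintervals that cannot beat the best value found. The function names, the toy loss, the search interval, and the constant L = 6.0 are all illustrative assumptions; the paper instead refines local Lipschitz constants dynamically over subsets of the weight space and can interleave back propagation as a local search.

import heapq
import math

def loss(w):
    """Toy criterion: squared error of a one-weight net y = tanh(w * x)
    on a single training pair (x, t). Illustrative, not from the paper."""
    x, t = 2.0, 0.5
    return (math.tanh(w * x) - t) ** 2

def lipschitz_bb(f, a, b, L, tol=1e-4, max_iter=10000):
    """Best-first branch-and-bound over [a, b] using a Lipschitz constant L
    of f: on a subinterval of half-width h with midpoint m, f is bounded
    below by f(m) - L * h, so subintervals whose bound cannot improve on
    the incumbent are pruned and the rest are bisected."""
    best_w = (a + b) / 2
    best_f = f(best_w)
    # Priority queue of (lower_bound, lo, hi), smallest bound first.
    heap = [(best_f - L * (b - a) / 2, a, b)]
    for _ in range(max_iter):
        if not heap:
            break
        bound, lo, hi = heapq.heappop(heap)
        # Smallest remaining bound is within tol of the incumbent:
        # no subinterval can improve, so best_f is tol-optimal.
        if bound > best_f - tol:
            break
        mid = (lo + hi) / 2
        for sub_lo, sub_hi in ((lo, mid), (mid, hi)):
            sub_mid = (sub_lo + sub_hi) / 2
            fm = f(sub_mid)
            if fm < best_f:
                best_f, best_w = fm, sub_mid
            child_bound = fm - L * (sub_hi - sub_lo) / 2
            if child_bound < best_f - tol:
                heapq.heappush(heap, (child_bound, sub_lo, sub_hi))
    return best_w, best_f

if __name__ == "__main__":
    w_star, f_star = lipschitz_bb(loss, -5.0, 5.0, L=6.0)
    print(f"w* = {w_star:.4f}, loss = {f_star:.6f}")

In this sketch the local-search augmentation described in the abstract would correspond to running a few gradient (back propagation) steps from each sampled midpoint to tighten the incumbent best_f faster, which shrinks the search tree without affecting the global convergence guarantee of the bounding step.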