Abstract
We introduce a new error function for backpropagation. The function is designed for binary decision problems in which there are a large number of regular training patterns and a small number of exceptional patterns. We identify three factors that cause the standard quadratic error function to be poorly suited to such problems. We also show that existing alternative error functions, such as cross entropy and Quickprop's error function, do not address all three factors. The principal novelty of our error function is that, as the discrepancy between an output unit's target value and its actual value approaches extreme values, the associated error signal approaches infinity. Simulation results show that this error function learns the N-2-N encoder, a classic exception task, faster and more reliably than the above error functions.
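The contrast the abstract draws can be illustrated by how each error signal behaves when a sigmoid output unit saturates at the wrong extreme. The sketch below is illustrative only: the quadratic and cross-entropy signals follow their standard backpropagation forms, and the artanh-style signal (the form used by Quickprop's error function) is shown as one example of a signal that diverges as the discrepancy approaches an extreme value; the paper's own error function is not reproduced here.

```python
import math

def quadratic_signal(t, y):
    # Standard quadratic error signal for a sigmoid output unit:
    # (t - y) * y * (1 - y).  The sigmoid-prime factor y*(1 - y)
    # drives the signal toward zero as y saturates at 0 or 1,
    # even when the unit is maximally wrong.
    return (t - y) * y * (1 - y)

def cross_entropy_signal(t, y):
    # Cross-entropy error signal for a sigmoid output unit: (t - y).
    # The sigmoid-prime factor cancels, so the signal no longer
    # vanishes, but it stays bounded by 1 in magnitude.
    return t - y

def atanh_signal(t, y):
    # Artanh-style signal (the form used in Quickprop's error
    # function): artanh(t - y) grows without bound as |t - y| -> 1,
    # so a maximally wrong unit receives an arbitrarily large signal.
    return math.atanh(t - y)

# A unit whose target is 1 but whose output has saturated near 0:
t, y = 1.0, 0.001
print(quadratic_signal(t, y))      # tiny: the signal has vanished
print(cross_entropy_signal(t, y))  # close to 1: bounded
print(atanh_signal(t, y))          # large: diverges as y -> 0
```

Under the quadratic error, the exceptional (maximally wrong) pattern contributes almost nothing to the weight update, which is one reason it is poorly suited to tasks dominated by regular patterns with a few exceptions.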