Abstract

We propose error backpropagation using the absolute error function as the objective function. Error backpropagation is the most popular learning algorithm for multi-layered neural networks, and the squared error function is usually used as its objective function. The squared error has a drawback, however: it becomes enormously large when the data set includes a few anomalous data points, which may be observational errors. The absolute error function, in contrast, is much less affected by such data. But because the absolute value function is not differentiable, the standard backpropagation procedure cannot be applied directly with the absolute error function as the objective. We therefore first introduce differentiable approximations of the absolute value function; the purpose of introducing an approximate function is to construct a differentiable error function that is close to the absolute error function. We then propose an error backpropagation algorithm that minimizes this differentiable approximate error function. Computational experiments indicate that the proposed method is practically efficient. In particular, it is observed to be more robust and to learn faster than backpropagation with the squared error function when the teacher signals include some incorrect data.

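The sketch below illustrates the general idea under stated assumptions; it is not the authors' exact formulation. It trains a one-hidden-layer network by backpropagation through a smooth surrogate for the absolute error, using sqrt(e^2 + eps^2) as the differentiable approximation of |e| (a common choice assumed here for illustration; the paper's own approximate function may differ), and it corrupts a few teacher signals to mimic anomalous data.

```python
# Minimal sketch (assumptions noted above): backpropagation with a smooth
# approximation of the absolute error as the objective function.
import numpy as np

rng = np.random.default_rng(0)

def smooth_abs(e, eps=1e-2):
    """Differentiable approximation of |e|; approaches |e| as eps -> 0."""
    return np.sqrt(e**2 + eps**2)

def smooth_abs_grad(e, eps=1e-2):
    """Derivative of the approximation with respect to the error e."""
    return e / np.sqrt(e**2 + eps**2)

# Toy data: a noisy linear target with a few grossly wrong teacher signals.
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X + 0.1 * rng.normal(size=(200, 1))
y[:5] += 20.0  # anomalous teacher signals

# One hidden layer with tanh units and a linear output unit.
W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.05
for epoch in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    e = y_hat - y

    # Backward pass: gradient of sum_i smooth_abs(e_i) instead of sum_i e_i^2.
    d_out = smooth_abs_grad(e)              # dL/d(y_hat), shape (N, 1)
    dW2 = h.T @ d_out; db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h**2)       # backprop through tanh
    dW1 = X.T @ d_h; db1 = d_h.sum(0)

    # Gradient-descent update (averaged over the data set).
    n = X.shape[0]
    W2 -= lr * dW2 / n; b2 -= lr * db2 / n
    W1 -= lr * dW1 / n; b1 -= lr * db1 / n

print("mean approximate absolute error:", smooth_abs(y_hat - y).mean())
```

Because smooth_abs_grad saturates at roughly +/-1 for large residuals, the few corrupted teacher signals contribute a bounded gradient, whereas the gradient of the squared error grows linearly with the residual; this is the robustness property the abstract attributes to the absolute error objective.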
