Abstract

Artificial neural network theory generally minimizes a standard statistical error, such as the sum of squared errors, to learn relationships from the presented data. However, applications in business have shown that real forecasting problems require alternative error measures: errors of identical magnitude can cause different costs. To reflect this, a set of asymmetric cost functions is proposed as novel error functions for neural network training. Consequently, a neural network minimizes an asymmetric cost function to derive forecasts that are better suited to the original decision problem. Experimental results are reported for forecasting a stationary time series with a multilayer perceptron trained on a linear asymmetric cost function, evaluating its performance against basic forecasting methods using various error measures.
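The abstract does not reproduce the paper's exact cost functions. As a minimal sketch of the idea, the following assumes a lin-lin (piecewise linear) asymmetric cost with hypothetical coefficients cost_under and cost_over, used in place of squared error as the training loss of a small multilayer perceptron in PyTorch; the network architecture and data are illustrative placeholders, not the paper's setup.

```python
import torch
import torch.nn as nn

def linlin_loss(y_pred, y_true, cost_under=2.0, cost_over=1.0):
    """Linear asymmetric (lin-lin) cost: under-forecasts (y_pred < y_true)
    are charged cost_under per unit of error, over-forecasts cost_over.
    The coefficient values here are illustrative assumptions."""
    error = y_true - y_pred
    return torch.mean(torch.where(error > 0,
                                  cost_under * error,      # under-forecast
                                  cost_over * (-error)))   # over-forecast

# Minimal MLP for one-step-ahead forecasting of a univariate series,
# trained by minimising the asymmetric cost instead of squared error.
model = nn.Sequential(nn.Linear(4, 8), nn.Tanh(), nn.Linear(8, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# x: lagged observations (batch, 4 lags), y: next value -- placeholder data
x, y = torch.randn(64, 4), torch.randn(64, 1)
for _ in range(200):
    optimizer.zero_grad()
    loss = linlin_loss(model(x), y)
    loss.backward()
    optimizer.step()
```

With cost_under greater than cost_over, gradient descent biases the network toward over-forecasting, which is the intended effect when under-forecasts are the more expensive error.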
