Abstract

Artificial Neural Networks (ANNs) provide models for a large class of natural and artificial phenomena that are difficult to handle using classical parametric techniques. They offer a way to fit all of the data, including any outliers, rather than discarding them. This paper compares the predictive performance of linear and nonlinear models in outlier detection. The best-subsets regression algorithm is used to select a minimal set of variables for the linear regression model, removing predictors that are irrelevant to the task to be learned. A Multi-Layer Perceptron is then trained to improve on the classification and prediction of the linear model, exploiting the nonlinear activation functions inherent in ANNs. The two models are compared by analyzing their Receiver Operating Characteristic (ROC) curves in terms of accuracy and misclassification rates. The linear and nonlinear models achieved accuracies of 68% and 93%, respectively, indicating a better fit for the nonlinear model.
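
The following is a minimal sketch of the pipeline described above, not the authors' code: a best-subsets search for the linear model's predictors, a Multi-Layer Perceptron trained on the selected predictors, and an accuracy/misclassification comparison. The synthetic dataset, subset-size limit, and network settings are illustrative assumptions.

```python
# Hypothetical sketch of the described pipeline: best-subsets variable
# selection, an MLP on the selected predictors, and a comparison of the
# two models. Data and hyperparameters are placeholders.
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the paper's data (labels mark outliers vs. inliers).
X, y = make_classification(n_samples=500, n_features=10, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Best-subsets search: evaluate every predictor subset up to a fixed size
# and keep the one giving the highest training AUC for the linear model.
best_auc, best_subset = -np.inf, None
for k in range(1, 5):
    for subset in combinations(range(X_tr.shape[1]), k):
        lin = LogisticRegression().fit(X_tr[:, subset], y_tr)
        auc = roc_auc_score(y_tr, lin.predict_proba(X_tr[:, subset])[:, 1])
        if auc > best_auc:
            best_auc, best_subset = auc, subset

# Linear model on the selected predictors.
lin = LogisticRegression().fit(X_tr[:, best_subset], y_tr)
lin_acc = accuracy_score(y_te, lin.predict(X_te[:, best_subset]))

# Multi-Layer Perceptron on the same predictors.
mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    random_state=0).fit(X_tr[:, best_subset], y_tr)
mlp_acc = accuracy_score(y_te, mlp.predict(X_te[:, best_subset]))

print(f"linear accuracy: {lin_acc:.2f}, misclassification: {1 - lin_acc:.2f}")
print(f"MLP accuracy:    {mlp_acc:.2f}, misclassification: {1 - mlp_acc:.2f}")
```

In practice the ROC curves themselves (e.g. via `sklearn.metrics.roc_curve`) would also be plotted for the two models, as the paper bases its comparison on them.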
