Abstract

The multiple regression model is popular among researchers in both the social and natural sciences because it is easy to interpret and has a well-established theoretical framework. The multi-output multiple regression model, in turn, is widely applied in engineering, since many industrial systems have multiple outputs. The ridge regression model and the Multi-Layer Perceptron (MLP) neural network are, respectively, linear and non-linear predictive regression models that are widely used in practice. This study builds multi-output ridge regression and MLP neural network models whose hyperparameters are selected by a grid search algorithm with cross-validation. The hyperparameter setting that yields the smallest RMSE on the validation data is used to train each model on the training data. For ridge regression, the hyperparameters are the combination of learning algorithm and alpha value; for the MLP neural network, the combination of the number of hidden nodes and the gamma value. For the ridge regression model, the smallest RMSE is obtained for alpha values in the range 0.1 to 0.7 across all learning algorithms considered, while for the MLP neural network the combination of 18 hidden nodes and gamma = 0.1 produces the smallest RMSE. With the selected hyperparameters, the ridge regression model performs better than the MLP neural network model (in terms of RMSE and R2) on both the training and testing data.
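
The abstract does not include the authors' code; the following is a minimal sketch of the described procedure, assuming scikit-learn, a synthetic multi-output dataset, and illustrative hyperparameter grids. The solver parameter stands in for the abstract's "learning algorithm", and the paper's "gamma" for the MLP is approximated here by MLPRegressor's L2 penalty (alpha) purely for illustration.

```python
# Sketch (not the authors' implementation): grid search with cross-validation to
# select hyperparameters for multi-output ridge regression and an MLP regressor,
# scored by RMSE, then compared on held-out test data with RMSE and R2.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic multi-output data standing in for the (unspecified) industrial dataset.
X, y = make_regression(n_samples=500, n_features=10, n_targets=3,
                       noise=0.5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# Ridge: grid over solver ("learning algorithm") and alpha, as in the abstract.
ridge_grid = GridSearchCV(
    Ridge(),
    param_grid={"solver": ["svd", "cholesky", "lsqr", "sparse_cg"],
                "alpha": [0.1, 0.3, 0.5, 0.7]},
    scoring="neg_root_mean_squared_error",
    cv=5,
)
ridge_grid.fit(X_train, y_train)

# MLP: grid over hidden-layer size; the paper's "gamma" is approximated here by
# the L2 penalty (alpha) of MLPRegressor, which is an assumption.
mlp_grid = GridSearchCV(
    MLPRegressor(max_iter=2000, random_state=0),
    param_grid={"hidden_layer_sizes": [(10,), (18,), (30,)],
                "alpha": [0.01, 0.1, 1.0]},
    scoring="neg_root_mean_squared_error",
    cv=5,
)
mlp_grid.fit(X_train, y_train)

# Evaluate the tuned models on the test split.
for name, grid in [("ridge", ridge_grid), ("mlp", mlp_grid)]:
    pred = grid.best_estimator_.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"{name}: best params {grid.best_params_}, "
          f"test RMSE {rmse:.3f}, R2 {r2_score(y_test, pred):.3f}")
```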
