Abstract

Artificial Neural Networks (ANNs) may suffer from suboptimal training and test performance, not only because of a large number of features with low statistical contribution but also due to their non-convex nature. This study develops piecewise-linear formulations that efficiently approximate the non-convex activation and objective functions of artificial neural networks, enabling globally optimal, simultaneous training and feature selection in regression problems. The formulations include binary variables that model both the presence of features and the piecewise-linear approximations; after one exact linearization step, this yields a mixed-integer linear programming problem whose convex relaxation guarantees a global optimum. The proposed formulation is applied to two industrial case studies. Results show that the method produces accurate approximations with only a few breakpoints, and that the substantial reduction of the feature space brings a notable improvement in test accuracy.
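The core idea of approximating a non-convex activation function with a few linear segments between breakpoints can be illustrated with a minimal sketch. The snippet below (a simplified illustration, not the paper's MILP formulation; the binary variables and exact linearization step are omitted) interpolates tanh linearly between a handful of breakpoints and measures the worst-case approximation error:

```python
import numpy as np

def tanh_pwl(x, breakpoints):
    """Piecewise-linear approximation of tanh: linear interpolation
    between the exact function values at the given breakpoints."""
    return np.interp(x, breakpoints, np.tanh(breakpoints))

# As the abstract suggests, a few breakpoints already give a close fit.
bp = np.linspace(-3.0, 3.0, 7)           # 7 breakpoints -> 6 linear pieces
x = np.linspace(-3.0, 3.0, 1001)
max_err = np.max(np.abs(tanh_pwl(x, bp) - np.tanh(x)))
```

In a full MILP formulation, each linear segment would be selected by binary variables (e.g. via SOS2 or big-M constraints), which is what makes the training problem solvable to global optimality by standard mixed-integer solvers.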
