In this paper, an efficient implicit Lagrangian twin parametric insensitive support vector regression is proposed that leads to a pair of unconstrained minimization problems, motivated by the works on twin parametric insensitive support vector regression (Peng: Neurocomputing. 79, 26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer: Neural Comput. Applic. 22(1), 257–267, 2013). Since the objective function is strongly convex, piecewise quadratic, and differentiable, it can be solved by gradient-based iterative methods. However, because the objective function involves the non-smooth ‘plus’ function, one can consider either the generalized Hessian or a smooth approximation function to replace the ‘plus’ function, and then apply a simple Newton-Armijo step-size algorithm. These algorithms can be easily implemented in MATLAB and do not require any optimization toolbox. The advantage of this method is that the proposed algorithms require less training time and can deal with data having a heteroscedastic noise structure. To demonstrate the effectiveness of the proposed method, computational results are obtained on synthetic and real-world datasets, which clearly show comparable generalization performance and improved learning speed in comparison with support vector regression, twin support vector regression, and twin parametric insensitive support vector regression.
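As a minimal illustration of the smoothing approach mentioned above, the Python sketch below (not taken from the paper; all names, parameter values, and the toy objective are illustrative assumptions) implements the commonly used smooth approximation p(x, α) = x + (1/α) log(1 + exp(−αx)) of the ‘plus’ function together with a generic Newton-Armijo iteration:

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    # Smooth approximation of the non-smooth 'plus' function (x)_+ = max(x, 0):
    #   p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    # which approaches (x)_+ as the smoothing parameter alpha grows.
    # logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) stably.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def newton_armijo(f, grad, hess, u0, tol=1e-6, max_iter=50,
                  delta=1e-4, beta=0.5):
    # Newton iteration with Armijo backtracking for an unconstrained,
    # strongly convex, differentiable objective f. `hess` may return a
    # generalized Hessian when f is only once differentiable.
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        g = grad(u)
        if np.linalg.norm(g) < tol:       # first-order stationarity
            break
        d = np.linalg.solve(hess(u), -g)  # Newton direction: H d = -g
        lam = 1.0
        # Armijo rule: take the largest step in {1, beta, beta^2, ...}
        # giving sufficient decrease f(u + lam*d) <= f(u) + delta*lam*g.d.
        while f(u + lam * d) > f(u) + delta * lam * g.dot(d):
            lam *= beta
        u = u + lam * d
    return u

# Toy check on a strongly convex quadratic (Newton converges in one step).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
c = np.array([1.0, -1.0])
u_star = newton_armijo(f=lambda u: 0.5 * u @ Q @ u - c @ u,
                       grad=lambda u: Q @ u - c,
                       hess=lambda u: Q,
                       u0=np.zeros(2))
print(u_star)  # approximately Q^{-1} c = [0.6, -0.8]
```

In the actual method, `f`, `grad`, and `hess` would be the implicit Lagrangian objective of the proposed formulation, its gradient, and its generalized (or smoothed) Hessian; those specific forms are given in the paper and are not reproduced here.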