Abstract

The recently proposed twin extreme learning machine (TELM) requires solving two quadratic programming problems (QPPs) to find two non-parallel hypersurfaces in the feature space, which brings in the additional requirement of an external optimization toolbox such as MOSEK. In this paper, we propose an implicit Lagrangian TELM for classification via an unconstrained convex minimization problem (ULTELMC) and further suggest iterative convergent schemes that eliminate the need for the external optimization toolbox generally required to solve the QPPs of TELM. The solutions for the dual variables of the proposed ULTELMC are obtained using iterative schemes containing the ‘plus’ function, which is not differentiable. To overcome this shortcoming, a generalized derivative approach and a smooth approximation approach are suggested. Further, to test the performance of the proposed approaches, classification performance is compared with support vector machine (SVM), twin support vector machine (TWSVM), extreme learning machine (ELM), twin extreme learning machine (TELM), and Lagrangian extreme learning machine (LELM). Moreover, since the iterative schemes do not require solving QPPs, they find the solution faster than the reported methods that find the solution in dual space. Computational times required to find the solutions are also presented for comparison.
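As a minimal illustrative sketch (not the paper's code), the following Python snippet shows the two standard ways the non-differentiable ‘plus’ function (x)+ = max(x, 0) is typically handled in such iterative Lagrangian schemes: a generalized (step-function) derivative and a smooth approximation; the function names and the smoothing parameter alpha are illustrative assumptions, not notation taken from the paper.

import numpy as np

def plus(x):
    # (x)+ = max(x, 0); non-differentiable at x = 0
    return np.maximum(x, 0.0)

def generalized_derivative(x):
    # Step function used as a generalized derivative of (x)+;
    # the value at x = 0 may be taken anywhere in [0, 1] (0.5 here).
    return np.where(x > 0, 1.0, np.where(x < 0, 0.0, 0.5))

def smooth_plus(x, alpha=5.0):
    # Smooth approximation p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    # which is differentiable everywhere and tends to (x)+ as alpha grows.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

x = np.linspace(-2.0, 2.0, 5)
print(plus(x))
print(smooth_plus(x, alpha=10.0))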
