Abstract

With the rapid development of information science and statistics, the ability to handle very large data sets has improved significantly, including in linear regression. Traditional statistical methods, however, break down when the number of samples is smaller than the rapidly growing data dimension, and constructing a mathematical model suited to this high-dimensional scenario can resolve the difficulty. Linear regression becomes a solvable problem once the singular design matrix is transformed into a nonsingular one by introducing an L2 norm as a penalty term (ridge regression). Nevertheless, this method must be replaced by a newer method, the Lasso, when variable selection is required, since the L2 penalty cannot set coefficients exactly to zero. The coordinate descent method, the alternating direction method of multipliers (ADMM), and the proximal gradient descent method are among the main algorithms for solving the Lasso. The coordinate descent method minimizes the L1-penalized objective over each component in turn. ADMM obtains the minimum by decomposing the objective function and processing the parts separately. The proximal gradient descent method applies a Taylor expansion of the smooth part of the objective at the current point, starting from a chosen initial value. This paper presents simulation experiments comparing the three algorithms in terms of accuracy and convergence speed. The results reveal that ADMM, superior to the other two in both speed and accuracy, is followed by the proximal gradient descent method, while the coordinate descent method performs worst.
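The abstract names the three solvers but does not list code. As a minimal illustrative sketch (not the authors' implementation), the three updates could look as follows in Python with NumPy, assuming the standard Lasso objective (1/2)||Ax - b||^2 + lam*||x||_1; the function names, parameter defaults, and stopping rule (a fixed iteration count) are all hypothetical choices made for this sketch.

import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding sign(v) * max(|v| - t, 0),
    # the proximal operator of t * ||.||_1; used by all three solvers.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_cd(A, b, lam, n_iter=100):
    # Coordinate descent: exactly minimize the L1-penalized objective
    # in one coordinate at a time, holding the others fixed.
    n, p = A.shape
    x = np.zeros(p)
    col_sq = (A ** 2).sum(axis=0)  # ||a_j||^2 (columns assumed nonzero)
    for _ in range(n_iter):
        for j in range(p):
            r_j = b - A @ x + A[:, j] * x[j]  # residual excluding coordinate j
            x[j] = soft_threshold(A[:, j] @ r_j, lam) / col_sq[j]
    return x

def lasso_ista(A, b, lam, n_iter=500):
    # Proximal gradient descent (ISTA): Taylor-expand the smooth term
    # around the current iterate, take a gradient step, then apply the
    # L1 prox, i.e. soft-thresholding.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = sigma_max(A)^2
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

def lasso_admm(A, b, lam, rho=1.0, n_iter=500):
    # ADMM: split the objective as 0.5*||Ax-b||^2 + lam*||z||_1 with the
    # constraint x = z, then alternate the x-, z-, and scaled dual updates.
    p = A.shape[1]
    x, z, u = np.zeros(p), np.zeros(p), np.zeros(p)
    M = A.T @ A + rho * np.eye(p)  # system matrix for the x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))
        z = soft_threshold(x + u, lam / rho)
        u = u + x - z
    return x

On synthetic data (for example, a sparse true coefficient vector and b = A @ x_true plus noise), all three sketches should recover essentially the same sparse solution; how many iterations each needs to get there is exactly the kind of comparison the paper's simulations report.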
