Abstract

This paper investigates the efficient solution of penalized quadratic regressions in high-dimensional settings. A novel and efficient algorithm for ridge-penalized quadratic regression is proposed, leveraging the matrix structures of the regression with interactions. Additionally, an alternating direction method of multipliers (ADMM) framework is developed for penalized quadratic regression with general penalties, including both single and hybrid penalty functions. The approach reduces the computations to basic matrix operations, making it appealing in terms of both memory usage and computational cost in high-dimensional settings.
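To make the setting concrete, the sketch below shows a generic ADMM iteration for a quadratic regression (main effects plus pairwise interactions) under a lasso penalty. It is a minimal illustration only, assuming dense NumPy operations and a single L1 penalty; the function names `quadratic_features` and `admm_lasso` are hypothetical, and this is not the structured, memory-efficient algorithm proposed in the paper.

```python
import numpy as np

def quadratic_features(X):
    """Expand X (n x p) with all pairwise interaction terms x_j * x_k, j <= k."""
    n, p = X.shape
    cross = [X[:, j] * X[:, k] for j in range(p) for k in range(j, p)]
    return np.column_stack([X] + cross)

def admm_lasso(Z, y, lam, rho=1.0, n_iter=200):
    """Generic ADMM for (1/2)||y - Z b||^2 + lam * ||b||_1.

    Alternates a ridge-type linear solve (b-update), a soft-thresholding
    step (z-update), and a dual update (u-update).
    """
    n, d = Z.shape
    ZtZ = Z.T @ Z
    Zty = Z.T @ y
    # Factor (Z'Z + rho I) once; the factor is reused in every b-update.
    L = np.linalg.cholesky(ZtZ + rho * np.eye(d))
    b = np.zeros(d)
    z = np.zeros(d)
    u = np.zeros(d)
    for _ in range(n_iter):
        rhs = Zty + rho * (z - u)
        b = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # Proximal step for the L1 penalty: soft-thresholding.
        z = np.sign(b + u) * np.maximum(np.abs(b + u) - lam / rho, 0.0)
        u = u + b - z
    return z

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 2 * X[:, 0] - 1.5 * X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(100)
Z = quadratic_features(X)
beta_hat = admm_lasso(Z, y, lam=5.0)
```

In this naive version the expanded design matrix Z and the Gram matrix Z'Z are formed explicitly, which scales poorly with the number of interaction terms; exploiting the matrix structure of the interaction expansion, as the abstract describes, is what avoids this cost.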
