Abstract
Personalization is becoming an important aspect of many predictive applications. We introduce a penalized regression method which inherently implements personalization. Personalized angle (PAN) regression constructs regression coefficients that are specific to the covariate vector for which one is producing a prediction, thus personalizing the regression model itself. This is achieved by penalizing the normalized prediction for a given covariate vector. The method therefore penalizes the normalized regression coefficients, or the angles of the regression coefficients in a hyperspherical parametrization, introducing a new angle-based class of penalties. PAN hence combines two novel concepts: penalizing the normalized coefficients and personalization. For an orthogonal design matrix, we show that the PAN estimator is the solution to a low-dimensional eigenvector equation. Based on the hyperspherical parametrization, we construct an efficient algorithm to calculate the PAN estimator. We propose a parametric bootstrap procedure for selecting the tuning parameter, and simulations show that PAN regression can outperform ordinary least squares, ridge regression and other penalized regression methods in terms of prediction error. Finally, we demonstrate the method in a medical application.
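As a rough illustration of the idea described in the abstract (a sketch under stated assumptions, not the paper's exact formulation), a penalty on the normalized prediction for a target covariate vector \(x_0\) could take the form

\[
\hat{\beta}_{\mathrm{PAN}}(x_0) \;=\; \operatorname*{arg\,min}_{\beta}\; \lVert y - X\beta \rVert_2^2 \;+\; \lambda\, g\!\left(\frac{x_0^{\top}\beta}{\lVert \beta \rVert_2}\right),
\]

where \(g\) denotes some penalty function and \(\lambda \ge 0\) a tuning parameter; the symbols \(g\) and \(\lambda\) here are assumptions made for illustration. Because the penalty depends on \(\beta\) only through \(\beta/\lVert \beta \rVert_2\), it acts on the direction (the angles) of the coefficient vector rather than its length, and because it also depends on \(x_0\), the resulting estimate is specific to the covariate vector being predicted, which is the sense in which the regression is personalized.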