Abstract

Methods of incorporating ridge-type regularization into partial redundancy analysis (PRA), constrained redundancy analysis (CRA), and partial and constrained redundancy analysis (PCRA) are discussed. The usefulness of ridge estimation in reducing mean square error (MSE) has long been recognized in multiple regression analysis, especially when predictor variables are nearly collinear and the ordinary least squares estimator is poorly determined. The ridge estimation method is extended to PRA, CRA, and PCRA, where reduced-rank ridge estimates of the regression coefficients are obtained by minimizing a ridge least squares criterion. In all cases, these estimates can be obtained in closed form for a fixed value of the ridge parameter; an optimal value of the ridge parameter is then found by G-fold cross-validation. Illustrative examples demonstrate the usefulness of the method in practical data analysis situations.
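The following is a minimal sketch of the general idea, not the authors' exact estimator: for a fixed ridge parameter, a full-rank ridge solution is computed in closed form and then truncated to a chosen rank via a singular value decomposition of the fitted values, with the ridge parameter selected by G-fold cross-validation on prediction error. The function names, the SVD-based rank truncation, and the synthetic data are illustrative assumptions only.

```python
import numpy as np


def ridge_rra(X, Y, lam, rank):
    """Reduced-rank ridge regression coefficients (simplified sketch).

    For a fixed ridge parameter `lam`, the full-rank ridge solution is
    B_ridge = (X'X + lam*I)^{-1} X'Y. The rank constraint is imposed here
    by projecting onto the leading right singular vectors of the fitted
    values X @ B_ridge (one common construction; assumed, not the paper's
    exact closed form).
    """
    n, p = X.shape
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # Rank reduction: SVD of the fitted values, keep the leading components.
    _, _, Vt = np.linalg.svd(X @ B_ridge, full_matrices=False)
    V_r = Vt[:rank].T                # (q x rank) leading right singular vectors
    return B_ridge @ V_r @ V_r.T     # reduced-rank coefficient matrix (p x q)


def cv_select_lambda(X, Y, lambdas, rank, G=5, seed=0):
    """Choose the ridge parameter by G-fold cross-validated prediction error."""
    rng = np.random.default_rng(seed)
    folds = rng.permutation(len(X)) % G
    errors = []
    for lam in lambdas:
        sse = 0.0
        for g in range(G):
            train, test = folds != g, folds == g
            B = ridge_rra(X[train], Y[train], lam, rank)
            sse += np.sum((Y[test] - X[test] @ B) ** 2)
        errors.append(sse)
    return lambdas[int(np.argmin(errors))]


# Tiny synthetic illustration (hypothetical data).
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 8))
Y = X @ rng.standard_normal((8, 4)) + 0.1 * rng.standard_normal((60, 4))
best_lam = cv_select_lambda(X, Y, lambdas=np.logspace(-3, 2, 10), rank=2)
B_hat = ridge_rra(X, Y, best_lam, rank=2)
print("selected lambda:", best_lam, "| coefficient matrix shape:", B_hat.shape)
```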
