Abstract

Principal Component Analysis (PCA) is one of the popular methods used to solve the multicollinearity problem. Researchers in 2014 proposed an estimator to solve this problem in the linear model when there are stochastic linear restrictions on the regression coefficients. This estimator, called the stochastic restricted principal components (SRPC) regression estimator, was constructed by combining the ordinary mixed estimator (OME) and the principal components regression (PCR) estimator. However, it ignores the number of components (the orthogonal matrix Tr) that researchers choose to solve the multicollinearity problem in the data matrix X. This paper proposes four different methods (the Lagrange function, the same technique used to derive the OME, the constrained principal component model, and substitution in the model) to modify the SRPC estimator for use in the presence of multicollinearity. Finally, a numerical example, an application, and a simulation study are introduced to illustrate the performance of the proposed estimator.

Highlights

  • According to the Gauss-Markov theorem, the linear regression model (LM) takes the form y = Xβ + ε [1] (1), where y is an n × 1 vector of responses, X is an n × p observed matrix of the variables, assumed to have full rank, i.e., rank(X) = p, β is a p × 1 vector of unknown parameters, and ε is an n × 1 vector of error terms assumed to be multivariate normally distributed with mean 0 and variance-covariance matrix σ²I

  • This theorem can be proved by four methods: 1. the Lagrange function, 2. the same technique used by researchers in 1961 to obtain the ordinary mixed estimator (OME) [2], 3. the constrained principal component model, and 4. substitution in model (4) using principal component assumptions

  • In the case of a multicollinearity problem, researchers have used other forms to estimate the parameters, such as the principal components regression (PCR) estimator; this problem occurs when the predictors included in the linear model are highly correlated with each other
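A minimal sketch (in Python with NumPy, not taken from the paper) of principal components regression under multicollinearity. The simulated data, the noise scales, and the choice of r = 2 retained components are illustrative assumptions; Tr denotes the matrix of retained eigenvectors, matching the notation in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a design matrix with severe multicollinearity:
# x3 is almost a copy of x1, so X'X is near-singular.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + rng.normal(scale=0.01, size=n)
X = np.column_stack([x1, x2, x3])
beta = np.array([1.0, 2.0, 3.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Eigen-decomposition of X'X: the columns of T are the principal directions.
eigvals, T = np.linalg.eigh(X.T @ X)
order = np.argsort(eigvals)[::-1]   # sort eigenvalues largest first
eigvals, T = eigvals[order], T[:, order]

# Keep the r components with the largest eigenvalues (here r = 2).
r = 2
Tr = T[:, :r]                       # p x r matrix of retained directions
Z = X @ Tr                          # component scores

# PCR: regress y on Z, then map back to the original coefficient space.
alpha_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)
beta_pcr = Tr @ alpha_hat

print("condition number of X'X:", eigvals[0] / eigvals[-1])
print("PCR coefficients:", beta_pcr)
```

Discarding the direction with the smallest eigenvalue is what stabilizes the estimate: that direction is exactly the near-collinear combination of x1 and x3 that inflates the variance of OLS.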


Summary

Introduction

According to the Gauss-Markov theorem, the linear regression model (LM) takes the form y = Xβ + ε. It is known that the ordinary least squares (OLS) estimator of β is β̂_OLS = (X′X)⁻¹X′y. The stochastic restricted model for β can be written as r = Rβ + e, where R is an m × p matrix. Researchers in 1961 obtained the Ordinary Mixed Estimator (OME) for the least squares method by combining the LM and the restricted model as follows [2]: β̂_OME = (X′X + R′V⁻¹R)⁻¹(X′y + R′V⁻¹r), where V is the variance-covariance matrix of the error term e in the restricted model, assumed to be a known and positive definite (pd) matrix. Section two presents another view of the SRPC estimator, while section three introduces four different methods for computing the SRPC estimator. The last section introduces a numerical example to show the difference between the old method introduced by previous papers [3] and the new method proposed in this paper
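As a sketch (Python/NumPy, my own illustration rather than the paper's code), the mixed estimator described above can be computed directly from the two models. The simulated data and the particular choices of R, r, and V are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear model: y = X beta + eps
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -1.0, 0.5])
y = X @ beta + rng.normal(scale=1.0, size=n)

# Stochastic linear restriction r = R beta + e with Var(e) = V (known, pd).
# Here: one restriction stating beta_1 + beta_2 = 0, observed with noise.
R = np.array([[1.0, 1.0, 0.0]])
V = np.array([[0.1]])
r = R @ beta + rng.normal(scale=np.sqrt(V[0, 0]), size=1)

# Ordinary mixed estimator:
# beta_OME = (X'X + R' V^-1 R)^-1 (X'y + R' V^-1 r)
Vinv = np.linalg.inv(V)
beta_ome = np.linalg.solve(X.T @ X + R.T @ Vinv @ R,
                           X.T @ y + R.T @ Vinv @ r)

# For comparison, the unrestricted OLS estimator:
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print("OLS:", beta_ols)
print("OME:", beta_ome)
```

The OME is a precision-weighted compromise: the smaller the restriction variance V, the more the estimate is pulled toward satisfying r = Rβ.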

Another View of the SRPC Estimator
The Proposed Estimator
The First Method
The Second Method
The Third Method
Numerical Example
Application Case
Simulation Study
Summary
