Abstract

In estimating logistic regression models, convergence of the maximization algorithm is critical, yet it can fail. Numerous bias-correction methods for maximum likelihood estimates of parameters have been proposed for complete data sets, and also for longitudinal models. Balanced data sets yield consistent estimates from conditional logit estimators for binary-response panel data models. When faced with a missing-covariates problem, researchers adopt various imputation techniques to complete the data, and the resulting estimates remain consistent asymptotically. For maximum likelihood estimation of logistic regression parameters with imputed covariates, however, the optimal choice of imputation technique, one yielding the best estimates with minimum variance, is still elusive. This paper examines the behaviour of the Hessian matrix at optimal values of the imputed covariate vector, which makes the Newton–Raphson algorithm converge faster through a reduced absolute value of the product of the score function and the inverse Fisher information component. We focus on a method that modifies the conditional likelihood function through partitioning of the covariate matrix. We also confirm that positive moduli of the Hessian for conditional estimators are sufficient for concavity of the log-likelihood function, resulting in optimal parameter estimates; an increased Hessian modulus ensures faster convergence of the parameter estimates. Simulation results reveal that model-based imputations perform better than classical imputation techniques, yielding estimates with smaller bias and higher precision for the conditional maximum likelihood estimation of nonlinear panel models.
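The Newton–Raphson update the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's method: it fits an ordinary (pooled, fully observed) logistic regression, where the update adds the product of the inverse Fisher information and the score to the current parameter vector, and convergence is declared once that product is small in absolute value. All variable names are illustrative assumptions.

```python
import numpy as np

def newton_raphson_logistic(X, y, tol=1e-8, max_iter=100):
    """Fit logistic regression by Newton-Raphson (Fisher scoring).

    Update: beta_new = beta + I(beta)^{-1} U(beta), where
    U(beta) = X'(y - p) is the score and I(beta) = X'WX is the
    Fisher information with W = diag(p * (1 - p)). Iteration stops
    when the step |I^{-1} U| falls below `tol`, mirroring the
    criterion on the score / inverse-information product.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        score = X.T @ (y - p)                  # score vector U(beta)
        W = p * (1.0 - p)                      # observation weights
        info = X.T @ (X * W[:, None])          # Fisher information X'WX
        step = np.linalg.solve(info, score)    # I^{-1} U
        beta = beta + step
        if np.max(np.abs(step)) < tol:         # product is small: converged
            break
    return beta
```

A well-conditioned (negative-definite) Hessian is what makes the solve stable and the iteration fast, which is why the abstract's focus on the Hessian modulus under different imputation choices matters for convergence.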
