Abstract

This chapter discusses the computational aspects of maximum likelihood estimation. Maximum likelihood estimation is a standard statistical technique that sets the parameters of a model equal to the values θ, called estimates, that make the observed data most probable. The technique is used in many econometric and statistical-inference problems, including multiple regression, discriminant analysis, and contingency tables, and it appears to be the best estimation method for discrete choice models. Evaluating the likelihood function generally requires computing a choice probability for each observation in the data set, using one of the available probability-evaluation methods. The shortcut derivative-evaluation method can be applied to the likelihood functions of choice-based samples and of random samples with ranked alternatives. For choice-based samples, one can calculate the gradient of the first term of the log-likelihood with the shortcut method and then add the gradient of the second term, which is easily computed numerically. For general likelihood functions, a stationary point can be a local minimum, a local maximum, or a saddle point, that is, a point that is neither a minimum nor a maximum; the type of stationary point is determined by the definiteness of the Hessian matrix. If the Hessian of the log-likelihood function does not have full rank, one cannot tell whether a local maximum or a local minimum has been reached.
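
As an illustration of the per-observation probability computation the abstract describes, the sketch below evaluates the log-likelihood of a multinomial logit model, a common discrete choice specification. The model form, array shapes, and names (log_likelihood, theta, X, y) are illustrative assumptions, not the chapter's notation.

    import numpy as np

    def log_likelihood(theta, X, y):
        # theta: (k,) parameter vector; X: (n, J, k) attributes of the J
        # alternatives for each observation; y: (n,) chosen-alternative indices.
        # (All names and shapes are illustrative, not the chapter's notation.)
        utilities = X @ theta                              # (n, J) systematic utilities
        utilities -= utilities.max(axis=1, keepdims=True)  # guard against overflow
        log_probs = utilities - np.log(np.exp(utilities).sum(axis=1, keepdims=True))
        # One choice probability is evaluated per observation, then summed in logs.
        return log_probs[np.arange(len(y)), y].sum()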

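The classification of stationary points can also be checked numerically: for a symmetric Hessian, all-negative eigenvalues indicate a local maximum, all-positive eigenvalues a local minimum, mixed signs a saddle point, and a (near-)zero eigenvalue signals the rank-deficient case in which the type cannot be determined. The helper below is a minimal sketch under these assumptions, not code from the chapter.

    import numpy as np

    def classify_stationary_point(hessian, tol=1e-8):
        eigvals = np.linalg.eigvalsh(hessian)   # eigenvalues of the symmetric Hessian
        if np.any(np.abs(eigvals) < tol):
            return "indeterminate"              # rank-deficient Hessian
        if np.all(eigvals < 0):
            return "local maximum"              # negative definite
        if np.all(eigvals > 0):
            return "local minimum"              # positive definite
        return "saddle point"                   # indefinite (mixed signs)
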