Abstract

The modern theory of statistics makes extensive use of optimization techniques in the development and implementation of statistical procedures. For example, the well-known problems of estimation and hypothesis testing require optimization. In linear regression, analysis of variance, and the design of experiments, extensive use is made of optimization techniques such as least squares, maximum likelihood estimation, and most powerful tests. The study of linear models with inequality constraints on the parameters requires the mathematical programming techniques of optimization. The Neyman–Pearson theory of hypothesis testing uses techniques from the nonclassical calculus of variations. Dynamic programming and linear and nonlinear programming have many applications in statistics. Numerical methods of optimization are used when closed-form solutions are not available. This chapter focuses on optimization techniques such as the Pontryagin maximum principle, simulated annealing, and stochastic approximation.
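To make two of the named techniques concrete, here is a minimal sketch of simulated annealing in Python (not drawn from the chapter itself); it assumes a one-dimensional objective, a uniform random-walk proposal, and a geometric cooling schedule, all of which are illustrative choices:

```python
import math
import random

def simulated_annealing(f, x0, n_iter=10_000, temp0=1.0, cooling=0.999,
                        step=0.5, seed=0):
    """Minimize a scalar function f by simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    temp = temp0
    for _ in range(n_iter):
        # Propose a random perturbation of the current state.
        x_new = x + rng.uniform(-step, step)
        fx_new = f(x_new)
        # Accept downhill moves always; accept uphill moves with
        # probability exp(-delta / temp), which shrinks as temp cools.
        delta = fx_new - fx
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            x, fx = x_new, fx_new
            if fx < best_fx:
                best_x, best_fx = x, fx
        temp *= cooling  # geometric cooling schedule (an assumption)
    return best_x, best_fx

# Illustrative multimodal objective: local minima trap greedy descent,
# but annealing can escape them while the temperature is still high.
f = lambda x: x**2 + 10 * math.sin(3 * x)
x_min, f_min = simulated_annealing(f, x0=5.0)
print(f"approximate minimizer: {x_min:.4f}, value: {f_min:.4f}")
```

Likewise, a minimal sketch of stochastic approximation in the Robbins–Monro form, assuming we seek a root of a regression function m(theta) = E[g(theta, noise)] observed only through noisy evaluations, with the classical step sizes a_n = a/n (so that the steps sum to infinity while their squares sum to a finite value):

```python
def robbins_monro(g, theta0, n_iter=5_000, a=1.0, seed=1):
    """Robbins-Monro recursion: theta_{n+1} = theta_n - (a/n) * Y_n,
    where Y_n is a noisy observation of m(theta_n)."""
    rng = random.Random(seed)
    theta = theta0
    for n in range(1, n_iter + 1):
        y = g(theta, rng.gauss(0.0, 1.0))  # noisy observation at theta
        theta -= (a / n) * y
    return theta

# Hypothetical example: m(theta) = theta - 2 observed with additive
# Gaussian noise; the root is theta* = 2.
g = lambda theta, eps: (theta - 2.0) + eps
print(robbins_monro(g, theta0=0.0))  # approximately 2.0
```

Both sketches are illustrative under the stated assumptions rather than the chapter's own implementations; practical use would tune the cooling schedule, proposal scale, and step-size constant to the problem at hand.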
