Abstract

The approximate theory of optimal linear regression design leads to specific convex extremum problems for numerical solution. A conceptual algorithm is stated, whose concrete versions lead us from steepest descent type algorithms to improved gradient methods, and finally to second order methods with excellent convergence behaviour. Applications are given to symmetric multiple polynomial models of degree three or less, where invariance structures are utilized. A final section is devoted to the construction of efficient exact designs of size N from the optimal approximate designs. For the multifactor cubic model and some of the most popular optimality criteria (D-, A-, and I-criteria) fairly efficient exact designs are obtained, even for small sample size N.
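As a rough illustration of the kind of gradient-type iteration the abstract refers to (not the paper's own algorithm), the following sketch applies the classical multiplicative algorithm for an approximate D-optimal design on a finite candidate set; the function name, grid, tolerance, and iteration limit are illustrative assumptions.

```python
import numpy as np

def d_optimal_weights(F, tol=1e-6, max_iter=10_000):
    """Multiplicative algorithm for an approximate D-optimal design (illustrative sketch).

    F : (m, p) array whose rows are the regression vectors f(x_i) of the m candidate
        points. Returns weights w (an approximate design) and the information matrix
        M(w) = sum_i w_i f(x_i) f(x_i)^T.
    """
    m, p = F.shape
    w = np.full(m, 1.0 / m)                       # start from the uniform design
    for _ in range(max_iter):
        M = F.T @ (w[:, None] * F)                # information matrix M(w)
        d = np.einsum("ij,jk,ik->i", F, np.linalg.inv(M), F)  # variance function d(x_i, w)
        # Kiefer-Wolfowitz equivalence theorem: w is D-optimal iff max_i d_i <= p
        if d.max() <= p + tol:
            break
        w *= d / p                                # multiplicative (gradient-type) update
        w /= w.sum()
    return w, M

# Illustrative use: one-factor cubic model f(x) = (1, x, x^2, x^3) on a grid over [-1, 1].
x = np.linspace(-1.0, 1.0, 101)
F = np.vander(x, 4, increasing=True)
w, M = d_optimal_weights(F)
print(x[w > 1e-3])   # mass concentrates near the known D-optimal support points +-1, +-1/sqrt(5)
```

An exact design of size N, as discussed in the final section of the paper, could then be obtained from such weights by a rounding step (for instance an efficient apportionment of N*w_i to integers); the paper's own construction may differ.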
