Abstract
Karlin and Studden (1966a) proved that each support point of an admissible approximate experimental design in a linear regression setup maximizes a nonnegative quadratic form in the vector of regression functions. As they noted, this result becomes trivial whenever the regression function contains a constant term. The main purpose of the present paper is to establish nontrivial necessary conditions for admissibility in these situations. Via a characterization of the comparability of two information matrices at a time, it is proved that the information matrix of an admissible design maximizes a linear function under certain linear restrictions. This implies that it also solves an unconstrained, nonconstant linear optimization problem, which yields nontrivial conditions on the support points similar to those of the Karlin and Studden result. As an example, we consider multiple polynomial regression. For the linear setup we give all ‘invariant’ and admissible designs. In the quadratic case (under the additional assumption that the experimental region K is convex), the support points of an ‘invariant’ and admissible design are found to be either zero or special boundary points of K.
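For orientation, a minimal sketch of the standard approximate-design framework presumed by the abstract (the notation $f$, $\xi$, $M(\xi)$, $A$ below is illustrative and not quoted from the paper): for a vector of regression functions $f(x) = (f_1(x), \ldots, f_k(x))^{\mathsf T}$ and an approximate design $\xi$, that is, a probability measure on the experimental region K, the information matrix is

$$
M(\xi) \;=\; \int_K f(x)\, f(x)^{\mathsf T}\, \xi(\mathrm{d}x),
$$

and $\xi$ is called admissible if there is no design $\eta$ with $M(\eta) \geq M(\xi)$ in the Loewner order and $M(\eta) \neq M(\xi)$. In this notation, the Karlin and Studden condition cited above states that every support point of an admissible design maximizes $x \mapsto f(x)^{\mathsf T} A\, f(x)$ over K for some nonnegative definite matrix $A$. If $f$ contains a constant term, say $f_1 \equiv 1$, the choice $A = e_1 e_1^{\mathsf T}$ makes this quadratic form identically equal to one, so every point of K maximizes it and the condition carries no information; this is the triviality the paper addresses.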