Abstract
The problem of classifying a new observation vector into one of two known groups, each distributed as multivariate normal with a common covariance matrix, is considered. In this paper, we treat the situation in which the dimension p of the observation vectors is less than the total number N of observation vectors from the two groups, but both p and N tend to infinity at the same rate. Since the sample covariance matrix is nearly ill-conditioned in this situation, it may be better to replace its inverse in linear discriminant analysis (LDA) with the inverse of a ridge-type estimator of the covariance matrix. The resulting rule is called the ridge-type linear discriminant analysis (RLDA). The second-order expansion of the expected probability of misclassification (EPMC) for RLDA is derived, and a second-order unbiased estimator of the EPMC is given. These results not only yield the corresponding conclusions for LDA, but also clarify the conditions under which RLDA improves on LDA in terms of EPMC. Finally, the performance of the second-order approximation and of the unbiased estimator is investigated by simulation.
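To illustrate the classification rule described above, the following is a minimal Python sketch of an RLDA-style classifier. It assumes the ridge-type covariance estimator has the common form S + lambda*I, where S is the pooled sample covariance matrix; the paper's exact estimator and choice of ridge parameter are not specified in the abstract, so this is an illustration rather than the authors' procedure.

    import numpy as np

    def rlda_classify(x, X1, X2, lam=0.1):
        """Classify x into group 1 or 2 with a ridge-type LDA rule.

        Assumes the ridge-type estimator S + lam * I (hypothetical form);
        X1, X2 are (n_i, p) arrays of training observations from each group.
        """
        n1, p = X1.shape
        n2, _ = X2.shape
        xbar1, xbar2 = X1.mean(axis=0), X2.mean(axis=0)
        # Pooled sample covariance matrix with n1 + n2 - 2 degrees of freedom
        S = ((X1 - xbar1).T @ (X1 - xbar1)
             + (X2 - xbar2).T @ (X2 - xbar2)) / (n1 + n2 - 2)
        # Ridge-type regularization keeps the inverse well conditioned
        # when p is close to N = n1 + n2
        S_ridge_inv = np.linalg.inv(S + lam * np.eye(p))
        # Linear discriminant score with the ridge-type inverse
        # in place of the inverse of S
        w = (x - 0.5 * (xbar1 + xbar2)) @ S_ridge_inv @ (xbar1 - xbar2)
        return 1 if w > 0 else 2

Setting lam = 0 recovers ordinary LDA, which is the baseline the paper compares against in terms of EPMC.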