Abstract
We propose a test of independence of two multivariate random vectors, given a sample from the underlying population. Our approach is based on the estimation of mutual information, whose decomposition into joint and marginal entropies facilitates the use of recently developed efficient entropy estimators derived from nearest neighbour distances. Critical values for the test may be obtained by simulation when an approximation to one marginal distribution is available, or by permuting the data otherwise. The former facilitates size guarantees, and we provide local power analyses, uniformly over classes of densities whose mutual information satisfies a lower bound. Our ideas may be extended to provide new goodness-of-fit tests for normal linear models, based on assessing the independence of the vector of covariates and an appropriately defined notion of an error vector. The theory is supported by numerical studies on both simulated and real data.
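The abstract's core idea, estimating mutual information through the entropy decomposition I(X; Y) = H(X) + H(Y) - H(X, Y) and calibrating by permutation, can be sketched as follows. This is a minimal illustration using the classical (unweighted) Kozachenko–Leonenko nearest-neighbour entropy estimator; the paper itself employs efficient weighted variants, so the estimator below, the choice of k, and the helper names are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(Z, k=3):
    """Classical Kozachenko-Leonenko entropy estimate from k-NN distances.

    (Illustrative stand-in for the efficient weighted estimators in the paper.)
    """
    n, d = Z.shape
    tree = cKDTree(Z)
    # Distance from each point to its k-th nearest neighbour;
    # the query returns the point itself at index 0, so ask for k+1.
    rho = tree.query(Z, k=k + 1)[0][:, k]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of the unit d-ball
    return -digamma(k) + digamma(n) + log_vd + d * np.mean(np.log(rho))

def mi_estimate(X, Y, k=3):
    """Mutual information via the decomposition I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return kl_entropy(X, k) + kl_entropy(Y, k) - kl_entropy(np.hstack([X, Y]), k)

def permutation_test(X, Y, B=200, k=3, seed=0):
    """Permutation p-value: permuting Y destroys any dependence on X,
    so the permuted statistics mimic the null distribution."""
    rng = np.random.default_rng(seed)
    stat = mi_estimate(X, Y, k)
    null = [mi_estimate(X, Y[rng.permutation(len(Y))], k) for _ in range(B)]
    return (1 + sum(s >= stat for s in null)) / (B + 1)
```

Under strong dependence (e.g. Y = X + small noise) the estimated mutual information is large relative to the permuted values and the p-value is small; for independent vectors the p-value is approximately uniform on (0, 1).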