Abstract

We introduce the projection divergence in a reproducing kernel Hilbert space to test for statistical independence and to measure the degree of nonlinear dependence. We suggest a slicing procedure to estimate the kernel projection divergence, which divides a random sample of size n into H slices, each of size c. The entire procedure has a complexity of O(n^2), which is prohibitive when n is extremely large. To alleviate the computational burden, we combine this slicing procedure with a block-wise estimation, which divides the whole sample into B blocks, each of size d. This block-wise slicing estimation has a complexity of O{n(c + d + log n)}, which is substantially smaller when c and d are relatively small. The resulting estimator is asymptotically normal with a convergence rate of {ncd/(c + d)}^{-1/2}. More importantly, the block-wise implementation has the same asymptotic properties as the naive slicing estimation when c is relatively small, indicating that it incurs no power loss in independence tests. We demonstrate the computational efficiency and theoretical properties of this block-wise slicing estimation through simulations and an application to psychological datasets.
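To make the block-and-slice structure concrete, the following is a minimal sketch, not the paper's kernel projection divergence: the per-block statistic is a generic placeholder (a slice-averaged kernel contrast), and the function names, the Gaussian kernel, and the bandwidth are illustrative assumptions. It only shows how partitioning the sample into B blocks of size d, sorting each block, and cutting it into slices of size c keeps the per-block kernel work at O(d^2), so the total cost grows roughly like O(nd) rather than O(n^2) on the full sample.

```python
import numpy as np

def gaussian_kernel_matrix(x, gamma=1.0):
    """Gram matrix of a Gaussian kernel on a 1-D sample (O(m^2) for m points)."""
    diff = x[:, None] - x[None, :]
    return np.exp(-gamma * diff ** 2)

def block_stat(xb, yb, slice_size):
    """Placeholder per-block statistic (NOT the paper's estimator).

    Sort the block by y (O(d log d)), cut it into slices of `slice_size`, and
    compare each within-slice kernel mean of x to the overall kernel mean.
    Under dependence, sorting by y groups similar x values, inflating the
    within-slice kernel means; under independence the contrast is near zero.
    """
    order = np.argsort(yb)
    xb = xb[order]
    K = gaussian_kernel_matrix(xb)            # O(d^2) kernel work per block
    overall_mean = K.mean()
    contrasts = []
    for start in range(0, len(xb) - slice_size + 1, slice_size):
        sl = slice(start, start + slice_size)
        contrasts.append(K[sl, sl].mean() - overall_mean)
    return float(np.mean(contrasts)) if contrasts else 0.0

def block_slice_estimate(x, y, block_size, slice_size, seed=None):
    """Average the placeholder statistic over B = n // block_size random blocks.

    Each block costs roughly O(block_size^2 + block_size log block_size),
    so the total is about O(n * block_size) up to log factors.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    perm = rng.permutation(n)                 # assign observations to blocks
    estimates = []
    for start in range(0, n - block_size + 1, block_size):
        idx = perm[start:start + block_size]
        estimates.append(block_stat(x[idx], y[idx], slice_size))
    return float(np.mean(estimates))

# Toy check: a dependent pair should score higher than an independent pair.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y_dep = x ** 2 + 0.1 * rng.normal(size=2000)
y_ind = rng.normal(size=2000)
print(block_slice_estimate(x, y_dep, block_size=100, slice_size=10, seed=1))
print(block_slice_estimate(x, y_ind, block_size=100, slice_size=10, seed=1))
```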
