Abstract
While the linear Pearson correlation coefficient is a well-established normalized measure for quantifying the inter-relation of two stochastic variables X and Y, it fails for multidimensional variables, such as Cartesian coordinates. Avoiding any assumption about the underlying data, the mutual information I(X, Y) does account for multidimensional correlations. However, unlike the normalized Pearson correlation, it has no upper bound (I ∈ [0, ∞)); that is, it is not clear whether, say, I = 0.4 corresponds to a low or a high correlation. Moreover, the mutual information (MI) involves the estimation of high-dimensional probability densities (e.g., six-dimensional for pairs of Cartesian coordinates), which requires a k-nearest-neighbor algorithm, such as the estimator by Kraskov et al. [Phys. Rev. E 69, 066138 (2004)]. Because existing methods to normalize the MI cannot be used in connection with this estimator, a new approach is presented, which employs an entropy estimation method that is invariant under variable transformations. The algorithm is numerically efficient and requires no more effort than the calculation of the (un-normalized) MI. After the method is validated on various toy models, the normalized MI between the Cα coordinates of T4 lysozyme is considered and compared to a correlation analysis of inter-residue contacts.
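As background for the estimator referenced above, the following is a minimal sketch of algorithm 1 of Kraskov et al., I(X, Y) = ψ(k) + ψ(N) − ⟨ψ(n_x + 1) + ψ(n_y + 1)⟩, which handles multidimensional X and Y directly. This is an illustrative implementation under standard assumptions (Chebyshev metric, strict neighbor counts), not the authors' code; the function name `ksg_mutual_information` is our own.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Kraskov-Stoegbauer-Grassberger (algorithm 1) MI estimate in nats.

    x, y : arrays of shape (N,) or (N, d); multidimensional variables
           such as Cartesian coordinates are handled without binning.
    k    : number of nearest neighbors in the joint (x, y) space.
    """
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    # eps[i]: Chebyshev distance to the k-th neighbor of point i in joint space
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    # n_x, n_y: marginal neighbors strictly closer than eps (exclude the point itself)
    r = np.nextafter(eps, 0.0)
    nx = cKDTree(x).query_ball_point(x, r, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, r, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

For bivariate Gaussian data with correlation ρ, the estimate can be checked against the analytic value I = −½ ln(1 − ρ²); the un-normalized result illustrates the interpretation problem discussed above, since nothing bounds it from above.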