Abstract

Most systems can be represented as networks that couple a series of nodes to each other via one or more edges, with typically unknown equations governing their quantitative behaviour. A major question then pertains to the importance of each of the elements that act as system inputs in determining the output(s). We show that any such system can be treated as a ‘communication channel’ for which the associations between inputs and outputs can be quantified via a decomposition of their mutual information into different components characterizing the main effect of individual inputs and their interactions. Unlike variance-based approaches, our novel methodology can easily accommodate correlated inputs.

Highlights

  • The analysis of networks represents a crucial focus of modern systems biology (Barabasi & Oltvai 2004; Hwang et al 2005; Kitano et al 2005; Klipp et al 2005; Wagner 2005; Alon 2006; Davidson 2006; Doyle & Stelling 2006; Kell 2006a,b; Palsson 2006)

  • This procedure is well known in statistics as ‘analysis of variance’ (ANOVA; e.g. Box et al 1978), and several authors have contributed to improve its computational efficiency for sensitivity analysis (e.g. Rabitz & Alis 1999; Sobol 2001)

  • Critchfield et al (1986) defined the mutual information index (MII), which in our notation is the mutual information between an input and the output normalized by the entropy of the output variable: s_i = I(X_i; Y) / H(Y) (see the sketch below)

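As a concrete illustration of the MII, the following sketch estimates s_i = I(X_i; Y) / H(Y) for a toy input–output system using a plug-in (histogram) estimator after discretizing the continuous variables. This is a minimal sketch in Python; the function names (`mii`, `mutual_information`), the binning scheme, and the toy model are our own illustrative choices, not the paper's implementation, and the plug-in estimator carries the finite-sample bias that the paper's bias-corrected estimates address.

```python
import numpy as np

def entropy(labels):
    """Plug-in (histogram) entropy estimate in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from discretized samples."""
    joint = np.stack([x, y], axis=1)
    # treat each (x, y) pair as a single symbol for the joint entropy
    _, joint_counts = np.unique(joint, axis=0, return_counts=True)
    p = joint_counts / joint_counts.sum()
    h_joint = -np.sum(p * np.log2(p))
    return entropy(x) + entropy(y) - h_joint

def mii(xi, y, n_bins=10):
    """Mutual information index s_i = I(X_i;Y) / H(Y) after binning."""
    xb = np.digitize(xi, np.histogram_bin_edges(xi, bins=n_bins))
    yb = np.digitize(y, np.histogram_bin_edges(y, bins=n_bins))
    return mutual_information(xb, yb) / entropy(yb)

# Toy system: Y depends strongly on X1 and only weakly on X2.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100_000)
x2 = rng.normal(size=100_000)
y = np.sin(x1) + 0.1 * x2
print(f"s_1 = {mii(x1, y):.3f}, s_2 = {mii(x2, y):.3f}")
```

Because the indices are normalized by the same discretized H(Y), the comparison between inputs is meaningful only when all of them are evaluated at one fixed discretization of the output.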

INTRODUCTION

The analysis of networks represents a crucial focus of modern systems biology (Barabasi & Oltvai 2004; Hwang et al 2005; Kitano et al 2005; Klipp et al 2005; Wagner 2005; Alon 2006; Davidson 2006; Doyle & Stelling 2006; Kell 2006a,b; Palsson 2006). We define novel and general sensitivity measures of second and higher order by evaluating input correlations induced by conditioning on the output. We further develop an information-theoretic framework for the sensitivity measures derived, based on the observation that their sum is bounded from above by the output entropy H(Y). From this viewpoint, the (information-theoretic) sensitivity indices quantify the amount of output uncertainty removed by knowledge of individual inputs and of combinations thereof. We apply the methodology successfully to a model of the NF-κB signalling pathway and thereby define how to modify its behaviour to provide a designed maximum effect.
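To see why second- and higher-order measures are needed at all, consider the classic XOR gate, in which neither input alone carries any information about the output yet the pair determines it completely. This is a toy example of our own for illustration, not the paper's NF-κB application; the decomposition assigns all of H(Y) to the pairwise interaction term:

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability vector (zeros allowed)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Binary XOR: X1, X2 fair coins, Y = X1 ^ X2.
# Joint distribution p(x1, x2, y) stored as a 2x2x2 table.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25

p_y = p.sum(axis=(0, 1))
h_y = H(p_y)
i1 = H(p.sum(axis=(1, 2))) + h_y - H(p.sum(axis=1).ravel())   # I(X1;Y)
i2 = H(p.sum(axis=(0, 2))) + h_y - H(p.sum(axis=0).ravel())   # I(X2;Y)
i12 = H(p.sum(axis=2).ravel()) + h_y - H(p.ravel())           # I(X1,X2;Y)

print(f"I(X1;Y)    = {i1:.3f} bits")   # 0: no first-order effect
print(f"I(X2;Y)    = {i2:.3f} bits")   # 0: no first-order effect
print(f"I(X1,X2;Y) = {i12:.3f} bits")  # 1: equals H(Y)
print(f"interaction term = {i12 - i1 - i2:.3f} bits")  # purely synergistic
```

Here I(X1;Y) = I(X2;Y) = 0 while I(X1,X2;Y) = H(Y) = 1 bit, so the first-order indices and the interaction term together exactly exhaust the output entropy, consistent with the information balance described below.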

METHODS

  • An information-theoretic first-order sensitivity index
  • Pairwise interactions
  • Higher order interactions
  • The information balance: a summation theorem for sensitivity indices
  • Total sensitivity indices
  • First-order sensitivity indices
  • Second-order sensitivity indices
  • Third-order interactions
  • Monte Carlo estimation of discretization entropy
  • Information balance
CONCLUSIONS

  • The sign of interaction information
  • Bias corrected estimates of mutual information
  • Decomposition of the total mutual information
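The Monte Carlo estimation of the discretization entropy can be illustrated with a short sketch: sample the inputs, push them through the model, and compute the entropy of the binned output. For a continuous Y the discretized entropy grows with the number of bins (roughly as the differential entropy plus log2 of the bin count), so every index must be computed at a single, fixed discretization. The `model` below is a hypothetical stand-in of our own, not the paper's NF-κB pathway model:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x1, x2):
    """Stand-in for an expensive network simulation (hypothetical)."""
    return np.sin(x1) + 0.5 * x2**2

# Monte Carlo sample of the output distribution
x1 = rng.normal(size=200_000)
x2 = rng.normal(size=200_000)
y = model(x1, x2)

# Discretization entropy H(Y_disc) at several bin counts
for n_bins in (8, 16, 32, 64):
    counts, _ = np.histogram(y, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    h = -np.sum(p * np.log2(p))
    print(f"{n_bins:3d} bins: H(Y_disc) = {h:.3f} bits")
```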
