Abstract
This work investigates the intersection property of conditional independence. It states that for random variables $A, B, C$ and $X$ we have that $X \perp\!\!\!\perp A \mid B, C$ and $X \perp\!\!\!\perp B \mid A, C$ together imply $X \perp\!\!\!\perp (A, B) \mid C$. Here, “$\perp\!\!\!\perp$” stands for statistical independence. Under the assumption that the joint distribution has a density that is continuous in $A$, $B$ and $C$, we provide necessary and sufficient conditions under which the intersection property holds. The result has direct applications to causal inference: it leads to strictly weaker conditions under which the graphical structure becomes identifiable from the joint distribution of an additive noise model.
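As an illustration of the property being discussed (not taken from the paper), the following sketch checks the two premises and the conclusion numerically on a toy discrete distribution. The distribution is constructed so that $X$ is fully independent of $(A, B, C)$, a case in which both premises and the conclusion hold by construction; all variable names here are hypothetical.

```python
import numpy as np

# Toy sanity check of the intersection property (illustrative only):
# X is constructed to be fully independent of (A, B, C), so both
# premises and the conclusion hold by construction.
rng = np.random.default_rng(0)
px = rng.dirichlet(np.ones(2))                      # p(x)
pabc = rng.dirichlet(np.ones(8)).reshape(2, 2, 2)   # p(a, b, c)
p = np.einsum('x,abc->xabc', px, pabc)              # joint p(x, a, b, c)

def indep(q, tol=1e-12):
    """Test U independent of V for a 2-D table q(u, v) (up to normalization)."""
    q = q / q.sum()
    return np.allclose(q, np.outer(q.sum(axis=1), q.sum(axis=0)), atol=tol)

# X indep A | B, C : independence in each conditional slice p(x, a | b, c)
x_indep_a = all(indep(p[:, :, b, c]) for b in range(2) for c in range(2))
# X indep B | A, C : independence in each conditional slice p(x, b | a, c)
x_indep_b = all(indep(p[:, a, :, c]) for a in range(2) for c in range(2))
# X indep (A, B) | C : flatten (a, b) into one axis per slice p(x, a, b | c)
x_indep_ab = all(indep(p[:, :, :, c].reshape(2, 4)) for c in range(2))

print(x_indep_a, x_indep_b, x_indep_ab)  # all True by construction
```

The paper's contribution concerns the converse direction under weaker assumptions: characterizing exactly when the two premises alone force the conclusion, which can fail for general distributions without conditions such as a positive density.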