Abstract

We thank Ahrens and Schisterman (henceforth, A&S) for their commentary1 on our article.2 Although it was not our original intention, we are grateful for the invited discussion on the place of causal inference in perinatal and pediatric epidemiology. In response, we briefly offer some clarifications and extensions.

A&S claim that we did not adjust for reduced hearing at age 18 months (Y1) in our analysis of the impact of postnatal cellphone exposure (X2) on hearing loss at age 7 years (Y2). In fact, we adjusted for Y1 and the other variables listed in the footnotes of Tables 3 and 4 of our article.2

In applying directed acyclic graphs (DAGs),3,4 A&S rightly caution against grouping several variables into a single node (e.g., B) in the DAG (Figure below, or in our article2), because such grouping implies that every arrow pointing into or out of B applies to every variable within B. We used the grouping to avoid clutter and were always mindful of this implication. A&S also raise the issue of a variable within a grouping being a collider. This applies to any collider that lies on a backdoor path and is selected for confounding control (e.g., A is simultaneously a confounder and a collider with respect to X2→Y2 in the DAG below). To eliminate the collider bias introduced by conditioning on A for confounding control, or on X2 (which, being a consequence of the collider A, induces conditioning on A),4 we need measured variable(s) that can close the open bidirected path between A and Y2 or between A and X2. This issue leads us to an important but overlooked result that should be part of the existing causal assumptions: there should be no uncontrolled collider bias before or after confounding control. That is, one must control any collider bias that arises from conditioning on a collider on an open backdoor path in order to close that path, or that arises when the exposure under study is caused by a collider that also lies on an open backdoor path.
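The collider mechanism described above can be illustrated with a small simulation. The following is a hypothetical sketch, not an analysis of our data: u1 and u2 are illustrative stand-ins for independent unmeasured causes of a collider a, and conditioning on a (here, crudely, by restricting to one stratum of it) induces a spurious association between them.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# u1, u2: hypothetical stand-ins for unmeasured common causes;
# independent of each other by construction.
u1 = rng.normal(size=n)
u2 = rng.normal(size=n)
a = u1 + u2 + rng.normal(size=n)   # a is a collider: u1 -> a <- u2

# Marginally, u1 and u2 are uncorrelated.
r_marginal = np.corrcoef(u1, u2)[0, 1]

# Conditioning on the collider (restricting to one stratum of a)
# induces a spurious (here, negative) association between u1 and u2.
stratum = a > 1.0
r_conditional = np.corrcoef(u1[stratum], u2[stratum])[0, 1]

print(f"marginal r = {r_marginal:+.3f}, within-stratum r = {r_conditional:+.3f}")
```

The same logic is why adjusting for a collider that sits on a backdoor path can open a new biasing path even as it closes the original one.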
Figure. Directed acyclic graph modified from Sudan et al2 to incorporate uncontrolled confounding between X2 and Y2, unmeasured common causes of A and X2 and of A and Y2, and non-differential independent misclassification of Y2.

We agree with A&S on the need for multiple-bias modeling.5 We expect to see more of it in the literature as probabilistic bias analysis becomes increasingly accepted by journals, large data sets become more available, and investigators routinely use bias formulas6,7 and simulation techniques. We disagree with A&S that bias analysis must be preceded by “placement of the unmeasured confounder in the DAG”1 and that such placement can reveal when “the potential bias is no longer a concern.”1 A known but unmeasured variable should be part of the working DAG from the outset, not left out until bias analysis. Adding a dashed bidirected arc between the exposure X2 and the outcome Y2 to our DAG at the bias-analysis stage implies the suspicion of at least one unmeasured, possibly unknown, common cause of X2 and Y2.

In our article, we triangulated our effect estimate using conventional logistic regression, inverse-probability-weighted (IPW) fitting of marginal structural models (MSMs), and doubly robust estimation (DRE), despite differences in the qualitative meaning of their effect estimates. This is useful because conflicting quantitative results, such as reversed effect directions, can send warning signals. Importantly, in our article the different estimates were in the same direction and of similar magnitude. A&S claim that this similarity in magnitude was simply due to minimal confounding in our study. Using hypothetical data with more confounding, we show this claim to be incorrect (see the first three models in the second column of the Table below). A&S then claim that we did not specify how we implemented DRE.
Please see the text and Tables 3 and 4 of our article.2

Table. Odds ratios (95% confidence limits) for the effect of X2 on Y2 obtained from conventional outcome regression, inverse-probability-weighted fitting of marginal structural models, doubly robust estimation, and union models using hypothetical data generated ...

Given journal space limitations, commentaries can sometimes confusingly oversimplify complex issues. First, doubly robust estimation is not the only such technique; others include IPW fitting of MSMs and g-estimation of structural nested models.8,9 Risking oversimplification, we conclude that causal analysis involves estimating well-defined causal effects using (i) (possibly untestable, qualitative) causal assumptions (e.g., no uncontrolled confounding) and (ii) appropriate statistical estimation techniques (e.g., doubly or multiply robust estimation) to remove existing bias without introducing new bias (e.g., handling time-varying confounding, mediation, or effect modification without introducing collider bias).3,4,8–10 Even the most sophisticated estimation technique, sans causal assumptions, cannot endow an estimate with causal meaning. Conversely, the simplest conventional regression model, coupled with appropriate causal assumptions, can be used for causal effect estimation. For details, we defer to our and A&S's references. Clear and well-defined research questions guided our analysis and presentation of the results of the difficult, yet important, pursuit of the role of environmental exposures in the health of children.
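The triangulation discussed above can be sketched with simulated data. This is a minimal, hypothetical illustration (not our study data; variable names and the data-generating model are assumptions for the example): with a measured confounder C of a binary exposure X, the unadjusted contrast is biased, while conventional outcome regression, IPW fitting of a marginal structural mean model, and one doubly robust (augmented IPW) estimator all recover the true effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical data: C confounds the effect of binary X on Y; true effect = 2.
c = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-c))          # true propensity score P(X=1 | C)
x = rng.binomial(1, p)
y = 2.0 * x + c + rng.normal(size=n)

# Unadjusted contrast: confounded by C.
naive = y[x == 1].mean() - y[x == 0].mean()

# (1) Conventional outcome regression: OLS of Y on X and C.
design = np.column_stack([np.ones(n), x, c])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
reg_est = beta[1]

# (2) IPW fit of a marginal structural mean model
# (the true propensity is used for brevity; in practice it is estimated).
ey1_ipw = np.average(y, weights=x / p)
ey0_ipw = np.average(y, weights=(1 - x) / (1 - p))
ipw_est = ey1_ipw - ey0_ipw

# (3) Augmented IPW, one doubly robust estimator: combines the outcome
# model's counterfactual predictions with the inverse-probability weights.
mu1 = beta[0] + beta[1] + beta[2] * c   # predicted Y setting X = 1
mu0 = beta[0] + beta[2] * c             # predicted Y setting X = 0
dr_est = (np.mean(x * (y - mu1) / p + mu1)
          - np.mean((1 - x) * (y - mu0) / (1 - p) + mu0))

print(naive, reg_est, ipw_est, dr_est)
```

In this setup the three adjusted estimators agree in direction and magnitude despite substantial confounding, which is the pattern (not the absence of confounding) that triangulation is meant to reveal.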
