Abstract

Hypotheses involving interactions, in which one variable modifies the association between two others, are very common. They are typically tested with models that assume effects are linear, for example, a regression of the form y = a + b·x + c·z + d·x·z. In the real world, however, few effects are linear, invalidating inferences about interactions. For instance, in realistic situations the false-positive rate for detecting an interaction can be 100%, and probing an interaction can reliably produce estimated effects of the wrong sign. In this article, I propose a revised toolbox for studying interactions in a curvilinear-robust manner, giving correct answers “even” when effects are not linear. It is applicable to most study designs and produces results that are analogous to those of current (often invalid) practices. The presentation combines statistical intuition, demonstrations with published results, and simulations.
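As a quick illustration of the false-positive claim, the sketch below is my own minimal simulation, not the article's code: the sample sizes, effect shapes, and the use of statsmodels are assumptions. It fits the standard linear-interaction regression y = a + b·x + c·z + d·x·z to data in which the effect of x on y is purely curvilinear and z is correlated with x but has no moderating role; the x·z term nonetheless comes out significant in most runs.

```python
# Minimal sketch (illustrative assumptions, not the article's simulations):
# a spurious interaction appears when x's effect is curvilinear and the
# supposed moderator z is merely correlated with x.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_sims, n, false_positives = 500, 200, 0

for _ in range(n_sims):
    x = rng.normal(size=n)
    z = 0.7 * x + rng.normal(scale=0.7, size=n)   # correlated with x, no true moderation
    y = x**2 + rng.normal(size=n)                 # curvilinear effect of x; z plays no role
    X = sm.add_constant(np.column_stack([x, z, x * z]))
    fit = sm.OLS(y, X).fit()
    false_positives += fit.pvalues[3] < 0.05      # p-value of the x*z coefficient

print(f"Spurious interaction detected in {false_positives / n_sims:.0%} of simulations")
```

Because x·z is itself correlated with x², the linear-interaction term soaks up the unmodeled curvature, which is the mechanism behind the near-100% false-positive rates described above.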
