Abstract

This study tackles the Garden of Forking Paths as a challenge to the replicability and reproducibility of ERP studies. We applied a multiverse analysis to a sample ERP N400 dataset donated by an independent research team, analyzing it with 14 pipelines selected through a systematic review to represent the full range of methodological variability found in the N400 literature. The pipelines were compared in depth with respect to statistical test outcomes, descriptive statistics, effect size, data quality, and statistical power. In this way we provide a worked example of how analytic flexibility can affect results in high-dimensional research fields such as ERP when analyzed with standard null-hypothesis significance testing. Of the methodological decisions that were varied, high-pass filter cut-off, artifact removal method, baseline duration, reference, measurement latency and locations, and amplitude measure (peak vs. mean) all affected at least some of the study outcome measures. Low-pass filtering was the only step that did not notably influence any of them. This study shows that even seemingly minor procedural deviations can influence the conclusions of an ERP study. We demonstrate the power of multiverse analysis both for identifying the most reliable effects in a given study and for providing insight into the consequences of methodological decisions.
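For readers unfamiliar with the idea, the sketch below illustrates how a space of candidate analysis pipelines might be enumerated from a handful of processing decisions. The decision points and option values here are illustrative placeholders only; the study's 14 pipelines were selected via systematic review, not by fully crossing these options.

```python
from itertools import product

# Hypothetical decision points, loosely mirroring those named in the abstract.
# The specific values are placeholders, not the options used in the study.
decisions = {
    "high_pass_hz": [0.01, 0.1, 0.5],
    "artifact_removal": ["ICA", "threshold_rejection"],
    "baseline_ms": [(-100, 0), (-200, 0)],
    "reference": ["average", "linked_mastoids"],
    "latency_window_ms": [(300, 500), (350, 550)],
    "amplitude_measure": ["mean", "peak"],
}

# Enumerate every combination of decisions (the "garden of forking paths");
# a multiverse analysis runs the same dataset through each resulting pipeline.
multiverse = [dict(zip(decisions, combo)) for combo in product(*decisions.values())]

print(f"{len(multiverse)} candidate pipelines")
for pipeline in multiverse[:3]:
    print(pipeline)
```

In practice each enumerated configuration would be applied to the same dataset and the resulting outcome measures compared across pipelines, which is the comparison this study carries out for its 14 literature-derived pipelines.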
