Abstract

The social zeitgeber model (Ehlers, Frank, & Kupfer, 1988) suggests that irregular daily schedules or social rhythms confer vulnerability to bipolar spectrum disorders. This study tested whether social rhythm regularity prospectively predicted first lifetime onset of bipolar spectrum disorders in adolescents already at risk for bipolar disorder on the basis of reward hypersensitivity. Adolescents (ages 14-19 years) previously screened as having high (n = 138) or moderate (n = 95) reward sensitivity, but no lifetime history of bipolar spectrum disorder, completed measures of depressive and manic symptoms, family history of bipolar disorder, and the Social Rhythm Metric. They were followed prospectively with semistructured diagnostic interviews every 6 months for an average of 31.7 (SD = 20.1) months. Hierarchical logistic regression indicated that low social rhythm regularity at baseline predicted greater likelihood of first onset of bipolar spectrum disorder over follow-up among high-reward-sensitivity adolescents but not moderate-reward-sensitivity adolescents, controlling for follow-up time, gender, age, family history of bipolar disorder, and initial manic and depressive symptoms (β = -.150, Wald = 4.365, p = .037, odds ratio = .861, 95% confidence interval [.748, .991]). Consistent with the social zeitgeber theory, low social rhythm regularity confers vulnerability to first onset of bipolar spectrum disorder among at-risk adolescents. It may be possible to identify adolescents at risk for developing a bipolar spectrum disorder, before onset occurs, on the basis of both reward hypersensitivity and social rhythm irregularity.
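
As a check on the reported effect size, note that in logistic regression the odds ratio is simply the exponentiated regression coefficient, so the two figures quoted above are mutually consistent:

\[ \mathrm{OR} = e^{\beta} = e^{-0.150} \approx 0.861 \]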
