Abstract

Bud dormancy is an important phase in the life cycle of perennial plants growing in environments where unsuitable growth conditions occur seasonally. In regions where low temperature defines these unsuitable conditions, the attainment of cold hardiness is also required for survival. The end of the dormant period culminates in budbreak and flower emergence, or spring phenology, one of the most appreciated and studied phenological events, and a time also understood to be most sensitive to low-temperature damage. Despite this, we have a limited physiological and molecular understanding of dormancy, which has negatively affected our ability to model budbreak; the same is true of cold hardiness. Here we highlight the importance of including cold hardiness in dormancy studies, which typically characterize only time to budbreak. We show how different temperature treatments may increase cold hardiness and, in doing so, also (potentially inadvertently) increase time to budbreak. We present a theory in which the evaluation of cold hardiness is key to clarifying physiological changes throughout the dormant period, delineating dormancy statuses, and improving both chill and phenology models. When cold hardiness is not phenotyped, budbreak datasets are open to erroneous interpretation. Changes in cold hardiness were very likely present in previous dormancy experiments, especially those that included below-freezing temperature treatments. Separating the effects of chilling accumulation from those of cold acclimation in future studies will be essential for increasing our understanding of dormancy and spring phenology in plants.
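
To make the chilling/acclimation distinction concrete, the following is a minimal sketch (not the authors' method) of how chilling accumulation is conventionally computed from hourly temperatures, using the classic chilling-hours convention of counting each hour between 0 °C and 7.2 °C as one chill unit; the function name and example values are illustrative assumptions. Note that below-freezing hours contribute no chill under this convention even though they may still drive cold acclimation, which is one way the two effects can become confounded in below-freezing treatments.

    # Illustrative sketch of a chilling-hours chill model (assumed
    # convention: each hour between 0 and 7.2 deg C counts as one
    # chill unit; this is not the authors' model).

    def chilling_hours(hourly_temps_c, lower=0.0, upper=7.2):
        """Accumulate chill units from a sequence of hourly temperatures (deg C)."""
        return sum(1 for t in hourly_temps_c if lower <= t <= upper)

    # Example: a day alternating hourly between 5 C (chill-effective)
    # and -3 C (below freezing, so it adds no chill under this
    # convention, although it may still promote cold acclimation).
    temps = [5.0, -3.0] * 12
    print(chilling_hours(temps))  # prints 12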
