Abstract

Budbreak is one of the most observed and studied phenological phases in perennial plants, but predictions remain a challenge, largely due to our poor understanding of dormancy. Two dimensions of exposure to temperature are generally used to model budbreak: accumulation of time spent at low temperatures (chilling) and accumulation of heat units (forcing). These two effects have a well-established negative correlation; with more chilling, less forcing is required for budbreak. Furthermore, temperate plant species are assumed to vary in the chilling requirement for dormancy completion that allows proper budbreak. Here, dormancy is investigated from the cold hardiness standpoint across many species, demonstrating that cold hardiness should be accounted for when studying dormancy and predicting budbreak. Most cold hardiness is lost prior to budbreak, but rates of cold hardiness loss (deacclimation) vary among species, leading to different times to budbreak. Within a species, deacclimation rate increases with accumulation of chill. When inherent differences between species in deacclimation rate are accounted for by normalizing rates throughout winter by the maximum rate observed, a standardized deacclimation potential is produced. Deacclimation potential is a quantitative measurement of dormancy progression based on responsiveness to forcing as chill accumulates; it increases similarly for all species, contradicting estimates of dormancy transition based on budbreak assays. This finding indicates that comparisons of physiological and genetic control of dormancy require an understanding of cold hardiness dynamics. Thus, an updated framework for studying dormancy and its effects on spring phenology is suggested, in which cold hardiness is used in lieu of (or in addition to) budbreak.
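
As a rough illustration of the normalization described in the abstract, the sketch below computes a deacclimation potential as the ratio of each observed deacclimation rate to the maximum rate recorded for a species over the winter. All variable names and numbers are hypothetical placeholders, not data or code from the study.

```python
import numpy as np

# Hypothetical deacclimation rates (loss of cold hardiness per unit of forcing)
# measured for one species at successive chill accumulations through winter.
# Values are illustrative only.
chill_accumulated = np.array([20, 40, 60, 80, 100])     # e.g., chill portions
deacclimation_rate = np.array([0.05, 0.15, 0.35, 0.55, 0.60])

# Deacclimation potential: each rate normalized by the maximum rate observed
# for that species, yielding a 0-1 index of responsiveness to forcing that
# can be compared across species.
deacclimation_potential = deacclimation_rate / deacclimation_rate.max()

for chill, psi in zip(chill_accumulated, deacclimation_potential):
    print(f"chill = {chill:>3} -> deacclimation potential = {psi:.2f}")
```

Because the normalization removes species-specific differences in maximum rate, curves of deacclimation potential versus chill accumulation can be overlaid across species, which is the basis for the comparison reported in the abstract.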
