Abstract

The roles of leaf wetness duration (LWD) and temperature in the development of ripe rot were evaluated with greenhouse grape clusters and detached berries inoculated with Colletotrichum fioriniae. Five classification methods (decision tree, logistic regression, neural network, random forest, and support vector machine) were used to develop environmental risk models for predicting ripe rot from these variables. Additionally, a phenological stage susceptibility model was derived from results of previously published trials. The environmental risk output multiplied by the phenological risk output gave the overall ripe rot disease risk. The models were then tested for their ability to accurately predict the severity of 45 past ripe rot epidemics and were implemented in virtual fungicide application-triggering ripe rot warning systems. In the greenhouse and detached fruit trials, temperatures between 27 and 32°C and longer LWDs resulted in the highest incidences of ripe rot. The 27°C and 24-h LWD treatment resulted in 100% ripe rot incidence in both greenhouse trials. Four candidate models were proposed for future field efficacy evaluations because they accurately predicted the severity of past ripe rot epidemics while virtually triggering few fungicide applications (an average of 2.5 to 3.0 per season). These models could be used to predict ripe rot infection events and inform management decisions for better control with reduced fungicide inputs. Copyright © 2023 The Author(s). This is an open access article distributed under the CC BY-NC-ND 4.0 International license.
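The abstract's core combination rule (disease risk = environmental risk × phenological susceptibility, feeding a spray-triggering warning system) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' model: the functional forms, the 29.5°C peak, the 24-h saturation, and the 0.5 action threshold are all hypothetical stand-ins chosen only to reflect the trends reported in the abstract (risk highest at 27 to 32°C and with longer LWD).

```python
def environmental_risk(lwd_hours: float, temp_c: float) -> float:
    """Toy stand-in for a classifier's environmental risk output.

    Risk rises with longer leaf wetness duration (saturating at 24 h)
    and peaks near the 27-32 C optimum reported in the trials. The
    exact shape here is hypothetical, not the paper's fitted model.
    """
    wetness = min(lwd_hours / 24.0, 1.0)               # saturates at 24 h
    temp = max(0.0, 1.0 - abs(temp_c - 29.5) / 15.0)   # peak near 29.5 C
    return wetness * temp

def disease_risk(env_risk: float, phenology_risk: float) -> float:
    """Overall ripe rot risk = environmental risk x phenological risk,
    as described in the abstract."""
    return env_risk * phenology_risk

def spray_triggered(risk: float, threshold: float = 0.5) -> bool:
    """A warning system triggers a fungicide application when combined
    risk exceeds an action threshold (0.5 is an arbitrary placeholder)."""
    return risk >= threshold
```

Under this sketch, a 24-h wetness period at 27°C during a susceptible phenological stage yields a high combined risk and would trigger an application, while a short, cool wetness event would not.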
