Abstract

Variation in forage composition decreases the accuracy of diets delivered to dairy cows. However, forage variability can be managed using a renewal reward model (RRM) and a genetic algorithm (GA) to optimize sampling and monitoring practices for farm conditions. Specifically, quality control charts that monitor forage composition can identify changes for which adjusting the formulated diet will better match the nutrients delivered to cows. The objectives of this study were to 1) assess the use of a clustering algorithm to estimate the mean time the process is stable or in control (d) (TStable) and the magnitude of the change in forage composition between stable periods (ΔForage) for corn silage and alfalfa-grass silage, which are input parameters for the RRM; 2) compare optimized farm-specific sampling practices (number of samples (n), sampling interval (TSample), and control limits (ΔLimit)) derived using previously proposed default values and our estimates of the TStable and ΔForage input parameters; and 3) conduct a simulation study to compare the number of recommended diet changes and the costs of quality control under the proposed sampling and monitoring protocols. We estimated the TStable and ΔForage parameters for corn silage NDF and starch and for alfalfa-grass silage NDF and CP using a k-means clustering approach applied to forage samples collected from 8 farms, 3×/week over a 16-week period. We compared 4 sampling and monitoring protocols that resulted from the 2 methods for estimating TStable and ΔForage (default values and our proposed method) and either optimizing only the control limit (Optim1) or optimizing the control limits, the number of samples, and the number of days between sampling (Optim2). We simulated the outcomes of implementing the optimized monitoring protocols using a quality control chart for the corn silage and alfalfa-grass silage of each farm.
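The clustering step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the 1-D k-means routine, the deterministic center initialization, and the example NDF series are all illustrative assumptions. The idea shown is that runs of consecutive samples assigned to the same cluster estimate the in-control time (TStable), and the distance between cluster means estimates the shift magnitude (ΔForage).

```python
import numpy as np

def kmeans_1d(x, k):
    """Simple 1-D k-means; centers initialized evenly over the data range
    so the example is deterministic (an illustrative choice, not the paper's)."""
    x = np.asarray(x, dtype=float)
    centers = np.linspace(x.min(), x.max(), k)
    for _ in range(100):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        new = np.array([x[labels == j].mean() if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

def stable_periods(labels):
    """Lengths (in sampling occasions) of consecutive runs in the same cluster."""
    runs, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[i - 1]:
            runs.append(i - start)
            start = i
    return runs

# Hypothetical corn silage NDF series (% of DM): a stable period near 38
# followed by a shift to near 42.
ndf = [38.0] * 10 + [42.0] * 10
labels, centers = kmeans_1d(ndf, k=2)
t_stable = sum(stable_periods(labels)) / len(stable_periods(labels))  # mean in-control run length
delta_forage = abs(centers[1] - centers[0])                           # estimated shift magnitude
```

In this toy series the mean run length is 10 sampling occasions and the estimated shift is 4 NDF units; on real farm data the runs would be converted to days using the sampling schedule.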
Estimates of TStable and ΔForage from the k-means clustering analysis were, respectively, shorter and larger than the previously proposed default values. In the simulated quality control monitoring, larger ΔForage estimates increased the optimized ΔLimit, resulting in fewer detected shifts in forage composition, a lower frequency of false alarms, and a lower quality control cost ($/d). Recommended diet reformulation intervals from the simulated quality control analysis were specific to the type of forage and the farm's management practices. The median diet reformulation interval across all farms under our optimal protocols was 14 d (Q1 = 8, Q3 = 26) for corn silage and 16 d (Q1 = 8, Q3 = 26) for alfalfa-grass silage.
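The monitoring decision that drives the simulated reformulation intervals can be sketched as a Shewhart-style limit check. This is a simplified assumption about the control chart logic, not the study's simulation code; the function name, the target value, and the sample readings are hypothetical.

```python
def needs_reformulation(samples, target, delta_limit):
    """Flag a diet change when the mean of the latest n forage samples
    deviates from the formulated target by more than the optimized
    control limit (delta_limit). Illustrative sketch only."""
    mean = sum(samples) / len(samples)
    return abs(mean - target) > delta_limit

# Hypothetical corn silage NDF readings (% of DM) against a 38.0 target
# with an optimized control limit of 2.0 NDF units:
needs_reformulation([40.1, 41.3, 40.8], target=38.0, delta_limit=2.0)  # shift detected
needs_reformulation([38.4, 37.6, 38.2], target=38.0, delta_limit=2.0)  # within limits
```

A wider ΔLimit (as produced by larger ΔForage estimates) makes this check trigger less often, which is the mechanism behind the fewer detected shifts and false alarms reported above.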
