Abstract

The aim of this study was to evaluate and compare the efficiency of the Bayesian and frequentist approaches in describing ruminal degradation of NDF. Simulated data comprised four scenarios: regular restriction of the number of incubation times, random loss of incubation times, loss of specific parts of the degradation curves, and variation in the precision of the incubation procedures. Two real datasets were also used, both concerning the evaluation of NDF degradation of a tropical grass (Brachiaria decumbens). The model was fitted according to the characteristics of each approach, and the fits were compared through plots and fit assessors. Both the Bayesian and frequentist approaches provided reliable estimates of the degradation parameters for most of the data tested. However, in specific cases with a small number of randomly spaced records, the Bayesian approach showed greater bias in the estimates of the incubation residue and produced estimates of the degradation rate without biological coherence, compared with frequentist inference. In other words, the Bayesian approach fitted with diffuse priors proved less flexible. Nevertheless, the importance of background information prior to modeling is emphasized, mainly for the Bayesian approach, in order to define proper prior distributions. Further detailed studies on the influence of non-informative priors on the parameters are needed.
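To make the fitting procedure concrete, the sketch below fits a first-order exponential residue model to hypothetical incubation data by frequentist nonlinear least squares. The model form R(t) = U + D·exp(-k·t), the time points, and the residue values are illustrative assumptions for demonstration only; they are not the model or data used in the study.

# Illustrative sketch only: the exponential residue model and the example data
# below are assumptions for demonstration, not the model or data from the study.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical incubation times (h) and NDF residues (% of incubated NDF)
t = np.array([0, 6, 12, 24, 48, 72, 96, 120], dtype=float)
residue = np.array([100.0, 88.5, 78.2, 62.4, 45.1, 37.8, 33.5, 31.9])

def exp_residue(t, U, D, k):
    """Undegradable fraction U plus degradable fraction D
    disappearing at fractional rate k (per hour)."""
    return U + D * np.exp(-k * t)

# Frequentist fit: nonlinear least squares with rough starting values
popt, pcov = curve_fit(exp_residue, t, residue, p0=[30.0, 70.0, 0.05])
se = np.sqrt(np.diag(pcov))  # asymptotic standard errors of U, D, k
print(dict(zip(["U", "D", "k"], popt)), se)

The same hypothetical model could equally be fitted in a Bayesian framework by placing prior distributions on U, D, and k and sampling the posterior, which is the kind of comparison the study addresses.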



Introduction

The use of frequentist, or classical, inference was almost unanimous among scientists in the early years of the twentieth century. Bayesian inference was avoided by researchers for a long time because of its highly complex mathematical resolution, which was not considered feasible with simple algebraic algorithms (LESAFFRE and LAWSON, 2012). In the early 1960s, Bayesian inference reappeared in a theoretical work (JEFFREYS, 1961), but it only became widely usable from the 1990s (GELFAND et al., 1990), when complex integrals could be solved by simulation. With this computational improvement, Bayesian inference re-emerged as a viable alternative for statistical modeling and analysis.
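To illustrate how simulation sidesteps analytic integration, the minimal Metropolis sampler below approximates the posterior mean of a degradation rate k under a diffuse prior. The model, data, and settings are hypothetical and serve only to show the idea behind the simulation-based methods referenced above, not any procedure from the study.

# Minimal Metropolis sampler: the posterior mean of a degradation rate k is
# approximated by averaging draws instead of solving an integral analytically.
# Model, prior, and data are hypothetical, not taken from the study.
import numpy as np

rng = np.random.default_rng(1)
t = np.array([6, 12, 24, 48, 72], dtype=float)
y = 70 * np.exp(-0.06 * t) + rng.normal(0, 2, t.size)  # synthetic residues

def log_post(k):
    if k <= 0:                       # flat (diffuse) prior restricted to k > 0
        return -np.inf
    resid = y - 70 * np.exp(-k * t)  # likelihood: Normal errors, sd = 2
    return -0.5 * np.sum((resid / 2.0) ** 2)

draws, k = [], 0.05
for _ in range(20000):
    prop = k + rng.normal(0, 0.01)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(k):
        k = prop                     # accept the proposed value
    draws.append(k)

print("posterior mean of k ~", np.mean(draws[5000:]))  # discard burn-in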
