Abstract

The Bayesian approach offers an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions, and three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. These lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. In Bayesian statistics the unknown parameters are random variables, whereas in traditional statistics, which is not founded on Bayes’ theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions; Monte Carlo methods can, of course, also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves the computation of a considerable number of derivatives and avoids the errors of linearization, so the Monte Carlo method is efficient. If a function of the measurements is given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known; the Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
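The error propagation described above can be sketched in code. The following is a minimal illustration, not the paper's own implementation: the transformation f and the distribution of x are hypothetical choices. A sample is drawn from the given multivariate distribution, transformed nonlinearly, and the expectation and covariance matrix of the result are estimated directly from the Monte Carlo variates, with no Jacobian or linearization required.

```python
# Monte Carlo error propagation for a nonlinearly transformed random vector.
# Hypothetical setup: x ~ N(mu, Sigma) is transformed by f(x) = (x1*x2, x1**2).
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([2.0, 3.0])
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

def f(x):
    # an illustrative nonlinear transformation of the random vector
    return np.column_stack((x[:, 0] * x[:, 1], x[:, 0] ** 2))

n = 200_000
x = rng.multivariate_normal(mu, Sigma, size=n)  # random variates from the given distribution
y = f(x)

E_y = y.mean(axis=0)           # Monte Carlo estimate of the expectation of f(x)
C_y = np.cov(y, rowvar=False)  # Monte Carlo estimate of the covariance matrix of f(x)
```

For this choice of f, the exact expectations are E[x1*x2] = mu1*mu2 + sigma12 = 6.01 and E[x1**2] = mu1**2 + sigma11 = 4.04, so the Monte Carlo estimates can be checked against known values.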

Highlights

  • The foundation of statistics is built on the theory of probability

  • A confidence region is derived where the unknown parameters are situated with a given probability

  • Measurements are analyzed to estimate unknown phenomena, the so-called unknown parameters, to establish a confidence region in which the unknown parameters are situated with a given probability, and to test hypotheses introduced for the unknown parameters


Introduction

The foundation of statistics is built on the theory of probability. Plausibility and uncertainty are expressed by probability. Bayesian statistics defines the probability for statements or propositions, so that probability is understood as a measure of the plausibility of statements. Bayesian statistics thus allows an intuitive approach to the methods of statistics: measurements are analyzed to estimate unknown phenomena, the so-called unknown parameters, to establish a confidence region in which the unknown parameters are situated with a given probability, and to test hypotheses introduced for the unknown parameters. The paper is organized as follows: Section 2 defines Bayesian statistics and Section 3 covers distributions.
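The three tasks named above, estimation, confidence region, and hypothesis test, can be sketched for a scalar parameter. This is a minimal illustration under an assumed setup: the Monte Carlo variates of the parameter are drawn here from a normal distribution, standing in for samples from whatever posterior distribution the analysis produces. The confidence region is read off from the sample quantiles, and a hypothesized value is accepted if it lies inside the region.

```python
# Monte Carlo confidence region and hypothesis test for a scalar parameter.
# Hypothetical setup: the variates stand in for posterior samples of the parameter.
import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(loc=5.0, scale=0.5, size=100_000)  # Monte Carlo variates of the parameter

theta_hat = samples.mean()  # point estimate of the unknown parameter

# 95% confidence region from the sample quantiles
lower, upper = np.quantile(samples, [0.025, 0.975])

def accept(theta0):
    # the hypothesis theta = theta0 is accepted if theta0 lies inside the region
    return lower <= theta0 <= upper
```

With these assumed numbers the region is approximately [4.02, 5.98], so the hypothesis theta = 5 would be accepted and theta = 7 rejected.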

Bayesian statistics
Conditional probability
Laws of probability
Generalized Sum Rule
Bayes’ Theorem
Distributions
Continuous Distribution
Multidimensional Continuous Distributions
Marginal Distribution
Conditional Distribution
Generalized Bayes’ Theorem
Variance and Covariance
Estimation and Hypothesis testing
Estimation of Confidence Region
Hypothesis Testing
Error Propagation by Monte Carlo methods
Conclusions