Abstract

This chapter serves as an introduction to concepts from elementary probability theory and information theory in the concrete context of the real line and multi-dimensional Euclidean space. The probabilistic concepts of mean, variance, expected value, marginalization, conditioning, and conditional expectation are reviewed. This part of the presentation overlaps somewhat with the previous chapter, which has some pedagogical benefit. There will be no mention of Borel measurability, σ-algebras, filtrations, or martingales, as these are treated in numerous other books on probability theory and stochastic processes such as [1, 14, 15, 27, 32, 48]. The presentation here, while drawing from these excellent works, is restricted to those topics required either in the mathematical and computational modeling of stochastic physical systems or in determining properties of solutions to the equations in these models. Basic concepts of information theory are addressed, such as measures of distance, or “divergence,” between probability density functions, and the properties of “information” and entropy. All pdfs treated here are differentiable functions on R^n. The entropy and information measures addressed in this chapter are therefore those referred to in the literature as the “differential” or “continuous” versions.
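As a small illustration of the differential quantities the abstract mentions, the sketch below (not taken from the chapter; all names and parameter values are illustrative) computes the differential entropy of a one-dimensional Gaussian pdf and the Kullback-Leibler divergence between two Gaussian pdfs, comparing numerical integration on a fine grid against the standard closed-form expressions:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian with mean mu and standard deviation sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Two illustrative Gaussian pdfs p and q (parameters chosen arbitrarily).
mu1, s1 = 0.0, 1.0
mu2, s2 = 1.0, 2.0

# Fine grid wide enough that the tails contribute negligibly.
x = np.linspace(-20.0, 20.0, 200001)
dx = x[1] - x[0]
p = gaussian_pdf(x, mu1, s1)
q = gaussian_pdf(x, mu2, s2)

# Differential entropy h(p) = -∫ p(x) ln p(x) dx,
# with closed form (1/2) ln(2·pi·e·sigma^2) for a Gaussian.
h_num = -np.sum(p * np.log(p)) * dx
h_exact = 0.5 * np.log(2.0 * np.pi * np.e * s1 ** 2)

# KL divergence D(p || q) = ∫ p(x) ln(p(x)/q(x)) dx,
# with the well-known closed form for two Gaussians.
kl_num = np.sum(p * np.log(p / q)) * dx
kl_exact = np.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * s2 ** 2) - 0.5

print(h_num, h_exact)    # the two entropy values agree closely
print(kl_num, kl_exact)  # the two divergence values agree closely
```

Note that, unlike the discrete Shannon entropy, the differential entropy computed here can be negative (e.g. for a Gaussian with small enough variance), which is one reason the “differential” qualifier matters.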
