Abstract
It is well known that a finite moment generating function (m.g.f.) corresponds to a unique probability distribution. So an important question arises: Is it possible to obtain an expression for the Fisher information, I_X(θ), using the m.g.f. alone, that is, without explicitly requiring a probability mass function (p.m.f.) or probability density function (p.d.f.), given that the p.m.f. or p.d.f. came from a one-parameter exponential family? We revisit the core of statistical inference by developing a clear link (Theorem 1.1) between the m.g.f. and I_X(θ). Illustrations are included.
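For orientation, the familiar textbook case in the natural parameterization suggests why such a link is plausible; the following is a minimal sketch under that assumed parameterization (with h and κ as notation introduced here), not a restatement of the paper's Theorem 1.1.

\[
  f(x;\theta) = h(x)\,\exp\{\theta x - \kappa(\theta)\}
  \;\Longrightarrow\;
  M_X(t;\theta) = \mathrm{E}_\theta\bigl[e^{tX}\bigr]
                = \exp\{\kappa(\theta + t) - \kappa(\theta)\}, \qquad t \in T,
\]
\[
  I_X(\theta) = \mathrm{Var}_\theta(X) = \kappa''(\theta)
              = \left.\frac{\partial^2}{\partial t^2}\,\log M_X(t;\theta)\right|_{t=0}.
\]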
Highlights
A moment generating function (m.g.f.) is widely used in both probability theory and statistics
θ is an unknown parameter with parameter space Θ, a sub-interval of the real line
Suppose that we are given an expression of M_X(t; θ) rather than f(x; θ), but we cannot immediately identify the exact nature of the p.m.f. or p.d.f. f(x; θ)
Summary
A moment generating function (m.g.f.) is widely used in both probability theory and statistics. θ is an unknown parameter with parameter space Θ, a sub-interval of the real line, and the domain space for t in M_X(t; θ), namely T, is assumed to be a sub-interval of the real line. We add a number of references: Tanaka (2006), Ghosh (1988), Mukhopadhyay and Banerjee (2013), and Mukhopadhyay (2014). Such discourses invariably bring the notion of the available information in an observation X about an unknown parameter θ back to the forefront. That route nearly demands that we know an exact expression of the p.m.f. or p.d.f. that gives rise to the particular m.g.f. on hand, assuming the m.g.f. is finite.
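To make the question concrete, here is a minimal symbolic sketch for the Poisson(θ) family, chosen purely as an illustration (the example, the symbol names, and the reparameterization step are mine, not the paper's): starting from the m.g.f. alone, differentiate log M_X(t; θ) twice in t, evaluate at t = 0, and carry the result back from the natural parameter η = log θ to θ.

# A minimal sketch (not the paper's Theorem 1.1): recover I_X(theta) for the
# Poisson(theta) family from its m.g.f. alone.
import sympy as sp

theta, t = sp.symbols('theta t', positive=True)

# m.g.f. of Poisson(theta): M_X(t; theta) = exp{theta (e^t - 1)}
M = sp.exp(theta * (sp.exp(t) - 1))

# In the natural parameterization eta = log(theta), the Poisson family is a
# natural exponential family, so I(eta) equals the second derivative of
# log M_X(t; theta) in t, evaluated at t = 0 (i.e., Var_theta(X)).
I_eta = sp.diff(sp.log(M), t, 2).subs(t, 0)          # equals theta

# Reparameterize back to theta: I(theta) = I(eta) * (d eta / d theta)^2.
I_theta = sp.simplify(I_eta * sp.diff(sp.log(theta), theta)**2)

print(I_theta)                                       # 1/theta
assert sp.simplify(I_theta - 1/theta) == 0           # matches the textbook value 1/theta

The point of the sketch is only that, for this family, no explicit use of the p.m.f. exp(-θ)θ^x/x! was needed.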