Abstract

Let (X, Z) be a bivariate random vector. A predictor of X based on Z is a Borel function g(Z). The problem of "least squares prediction" of X given the observation Z is to find the global minimum point of the functional E[(X − g(Z))²] over all random variables g(Z), where g is a Borel function. It is well known that the solution of this problem is the conditional expectation E(X|Z). It is also known that if, for a nonnegative smooth function F: ℝ×ℝ → ℝ, arg min_{g(Z)} E[F(X, g(Z))] = E[X|Z] for all X and Z, then F(x, y) is a Bregman loss function. It is further of interest, for a fixed ϕ, to find F(x, y) satisfying arg min_{g(Z)} E[F(X, g(Z))] = ϕ(E[X|Z]) for all X and Z. In a more general setting, a stronger problem is to find F(x, y) satisfying arg min_{y∈ℝ} E[F(X, y)] = ϕ(E[X]) for all X. We study these problems and develop a partial differential equation (PDE) approach to their solution.
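The defining property of a Bregman loss can be checked numerically: for any strictly convex generator ϕ, the Bregman loss F(x, y) = ϕ(x) − ϕ(y) − ϕ′(y)(x − y) has E[X] as the unique minimizer of y ↦ E[F(X, y)]. A minimal sketch (the choice of generator ϕ(t) = eᵗ and the uniform distribution for X are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 3.5, size=100_000)  # any distribution with finite mean

# Bregman loss generated by a strictly convex phi:
#   F(x, y) = phi(x) - phi(y) - phi'(y) * (x - y)
phi = np.exp    # illustrative generator; phi' = exp as well
dphi = np.exp

def expected_loss(y):
    """Empirical estimate of E[F(X, y)] for a scalar predictor y."""
    return np.mean(phi(X) - phi(y) - dphi(y) * (X - y))

# Grid search for the minimizer over y.
ys = np.linspace(0.5, 3.5, 3001)
y_star = ys[np.argmin([expected_loss(y) for y in ys])]

# For a Bregman loss the empirical minimizer is exactly the sample mean,
# so y_star agrees with X.mean() up to the grid resolution.
print(y_star, X.mean())
```

Repeating the experiment with a different strictly convex ϕ (e.g. ϕ(t) = t²,  giving the squared loss) leaves the minimizer unchanged, which is exactly the characterization the abstract refers to.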
