Abstract

An optimal modification of the classical LSC prediction method is presented, which removes its inherent smoothing effect while sustaining most of its local prediction accuracy at each computation point. Our 'de-smoothing' approach is based on a covariance-matching constraint that is imposed on a linear modification of the usual LSC solution, so that the final predicted field reproduces the spatial variation patterns implied by an adopted covariance (CV) function model. In addition, an optimality criterion is enforced which minimizes the loss in local prediction accuracy (in the mean squared sense) that occurs during the transformation of the original LSC solution to its CV-matching counterpart. The merit and the main theoretical principles of this signal-CV-adaptive technique are explained analytically, and a comparative example with the classical LSC prediction method is given.
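The idea can be sketched numerically. In the toy example below, a classical LSC predictor is built from a covariance model, and a linear transform R is then applied so that the de-smoothed field reproduces the model covariance. All specifics here are assumptions for illustration only, not taken from the paper: the 1-D geometry, the exponential covariance model, and the symmetric square-root form of the matching transform (one standard minimum-loss choice in the de-smoothing literature).

```python
import numpy as np

# Hypothetical 1-D setting: predict a signal s on a grid from noisy point data y.
def cov(a, b, var=1.0, L=0.3):
    # Exponential (first-order Markov) covariance model -- an assumed choice.
    return var * np.exp(-np.abs(a[:, None] - b[None, :]) / L)

xp = np.linspace(0.0, 1.0, 12)   # prediction points
xo = np.linspace(0.0, 1.0, 20)   # observation points
noise_var = 0.05

Css = cov(xp, xp)                                # signal CV at prediction points
Cso = cov(xp, xo)                                # signal/data cross-CV
Coo = cov(xo, xo) + noise_var * np.eye(len(xo))  # data CV (signal + noise)

# Classical LSC predictor s_hat = A y with A = Cso Coo^{-1}.
A = np.linalg.solve(Coo, Cso.T).T
Chat = A @ Coo @ A.T   # CV of the LSC prediction; Chat < Css reflects the smoothing

def sqrtm_sym(M, inv=False):
    # Symmetric matrix (inverse) square root via eigendecomposition.
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 1e-12, None)
    r = 1.0 / np.sqrt(w) if inv else np.sqrt(w)
    return (V * r) @ V.T

# Covariance-matching transform: choose R with R Chat R^T = Css, so the
# de-smoothed field R s_hat reproduces the adopted CV model exactly.
S = sqrtm_sym(Css)
R = S @ sqrtm_sym(S @ Chat @ S, inv=True) @ S

print(np.allclose(R @ Chat @ R.T, Css, atol=1e-8))  # constraint is satisfied
print(np.trace(Chat) < np.trace(Css))               # LSC output is the smoother field
```

The trace comparison makes the abstract's starting point concrete: the covariance of the plain LSC prediction is strictly "smaller" than the signal covariance model, which is the smoothing effect the matching transform removes.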
