Abstract
Kormylo and Mendel proposed a maximum-likelihood deconvolution (MLD) algorithm for estimating a desired sparse spike sequence μ(k), modelled as a Bernoulli-Gaussian (BG) signal, that has been distorted by a linear time-invariant system v(k). Chi, Mendel and Hampson later proposed a computationally fast MLD algorithm that has been used successfully to process real seismic data. In this paper, we propose an adaptive MLD algorithm, which allows v(k) to be a slowly time-varying linear system, for estimating the BG signal μ(k) from noisy data. Like the previous MLD algorithms, the proposed adaptive MLD algorithm can recover the phase of v(k) when v(k) is time-invariant. Simulation results are provided to support the proposed algorithm.
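To make the signal model concrete, the following is a minimal sketch of generating a Bernoulli-Gaussian sparse spike sequence μ(k) and passing it through a linear system v(k) with additive noise, as in the deconvolution setting the abstract describes. The sequence length, spike probability, variances, and the decaying-sinusoid wavelet are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bernoulli-Gaussian sparse spike sequence mu(k):
# a spike occurs with probability lam; its amplitude is zero-mean Gaussian.
N, lam, sigma_a = 200, 0.05, 1.0      # assumed illustrative parameters
r = rng.random(N) < lam               # Bernoulli event sequence r(k)
a = rng.normal(0.0, sigma_a, N)       # Gaussian amplitude sequence a(k)
mu = r * a                            # BG signal mu(k) = r(k) * a(k)

# Example time-invariant system v(k): a decaying sinusoid (assumed wavelet).
k = np.arange(30)
v = np.exp(-0.15 * k) * np.sin(0.6 * k)

# Noisy observed data: z(k) = (mu * v)(k) + w(k), with white Gaussian w(k).
sigma_w = 0.05
z = np.convolve(mu, v)[:N] + rng.normal(0.0, sigma_w, N)
```

An MLD algorithm would take only z and (an estimate of) v as input and attempt to recover the sparse sequence mu; the adaptive variant proposed in the paper additionally lets v drift slowly over time.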