Abstract

We propose a sequence of universal denoisers motivated by the goal of extending the notion of twice-universality from universal data compression theory to the sliding window denoising setting. Given a sequence length n and a denoiser, the kth-order regret of the latter is the maximum excess expected denoising loss relative to sliding window denoisers with window length 2k+1, where, for a given clean sequence, the expectation is over all channel realizations and the maximum is over all clean sequences of length n. We define the twice-universality penalty of a denoiser as its excess kth-order regret when compared to a bound on the kth-order regret of the denoising algorithm DUDE with parameter k, and we are interested in denoisers with a negligible penalty for all k simultaneously. We consider a class of denoisers that apply one of a number of constituent denoisers based on minimizing an estimated denoising loss, and we establish a formal relationship between the error in the estimated denoising loss and the twice-universality penalty of the resulting denoiser. Given a sequence of window parameters k_n, increasing sufficiently fast in n, we use this approach to construct and analyze a specific sequence of denoisers that achieves a much smaller twice-universality penalty for k < k_n than the sequence of DUDE denoisers with parameter k_n.
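For concreteness, the sketch below illustrates the kind of construction the abstract describes, under stated assumptions rather than as the paper's actual algorithm: a minimal two-pass DUDE for a known invertible discrete memoryless channel Pi and loss matrix Lam, a loss estimate computed from the noisy sequence alone in the spirit of the estimated-denoising-loss selection step, and a wrapper that picks the context length k <= k_max with the smallest estimate. All function names (dude_rule, apply_rule, estimated_loss, select_and_denoise), the exact form of the estimator, and the BSC/Hamming-loss demo are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from collections import defaultdict


def dude_rule(z, k, Pi, Lam):
    """Sliding-window rule (window length 2k + 1) induced by DUDE on z.

    Pi[x, y] is the known channel probability of output y given clean
    input x; Lam[x, xhat] is the loss for reconstructing x as xhat.
    """
    A = Pi.shape[0]
    Pi_inv_T = np.linalg.inv(Pi.T)

    # First pass: count the center symbol under every (left, right)
    # context of length k observed in the noisy sequence.
    counts = defaultdict(lambda: np.zeros(A))
    for i in range(k, len(z) - k):
        counts[tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1])][z[i]] += 1

    def rule(left, center, right):
        # DUDE decision: argmin over xhat of m^T Pi^{-T} (lambda_xhat * pi_center),
        # with m the count vector of the surrounding context.
        q = counts[left, right] @ Pi_inv_T
        return int(np.argmin((q * Pi[:, center]) @ Lam))

    return rule


def apply_rule(z, k, rule):
    """Second pass: slide the rule over z; boundary symbols pass through."""
    xhat = np.array(z, copy=True)
    for i in range(k, len(z) - k):
        xhat[i] = rule(tuple(z[i - k:i]), z[i], tuple(z[i + 1:i + k + 1]))
    return xhat


def estimated_loss(z, k, rule, Pi, Lam):
    """Estimate the rule's expected per-symbol loss from the noisy z alone.

    At each interior position the rule is evaluated for every possible
    center symbol, and the resulting loss profile is mapped through
    Pi^{-1}; for a window rule fixed independently of z_i, the channel
    expectation of the summand equals the true conditional expected loss.
    """
    A = Pi.shape[0]
    Pi_inv = np.linalg.inv(Pi)
    interior = len(z) - 2 * k
    if interior <= 0:
        return np.inf
    total = 0.0
    for i in range(k, len(z) - k):
        left, right = tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1])
        outs = [rule(left, c, right) for c in range(A)]
        # rho[x] = E[Lam(x, rule(window)) | clean center symbol = x]
        rho = np.array([Pi[x] @ Lam[x, outs] for x in range(A)])
        total += (Pi_inv @ rho)[z[i]]
    return total / interior


def select_and_denoise(z, k_max, Pi, Lam):
    """Run DUDE for k = 0..k_max; keep the k with smallest estimated loss."""
    best_loss, best_xhat = np.inf, np.array(z, copy=True)
    for k in range(k_max + 1):
        rule = dude_rule(z, k, Pi, Lam)
        loss = estimated_loss(z, k, rule, Pi, Lam)
        if loss < best_loss:
            best_loss, best_xhat = loss, apply_rule(z, k, rule)
    return best_xhat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    delta = 0.1                                       # BSC crossover probability
    Pi = np.array([[1 - delta, delta], [delta, 1 - delta]])
    Lam = np.array([[0.0, 1.0], [1.0, 0.0]])          # Hamming loss
    x = (np.arange(10_000) // 50) % 2                 # piecewise-constant clean signal
    z = x ^ (rng.random(x.size) < delta).astype(int)  # pass through the BSC
    xhat = select_and_denoise(z, k_max=4, Pi=Pi, Lam=Lam)
    print("noisy error:", np.mean(z != x), "denoised error:", np.mean(xhat != x))
```

Returning the DUDE decision as a closure lets the loss estimator query counterfactual center symbols while the context counts stay fixed. The error of this kind of loss estimate is exactly the quantity the abstract relates formally to the twice-universality penalty of the resulting selection-based denoiser.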
