Abstract

Efficient image reconstruction, which removes high-density impulse noise from a single corrupted image, is a key technology for computer-vision systems such as people counting, crowd analysis, action recognition, and human tracking. Recently, a surge of state-of-the-art sparse approximation approaches to impulse noise removal has used corrupted texture information or remnant noise-free information to recover a corrupted image. However, these sparse approximation approaches are known to fail on images corrupted with high-density impulse noise, because each cropped window contains too little noise-free information to form a sparse approximation. We therefore propose a novel image restoration approach based on inverse-distance weighting with sparse approximation for removing high-density impulse noise from a single image. The proposed method uses an inverse-distance weighting-based prediction model to produce potential noise-free pixels; unlike other recent noise removal methods, it does not rely on corrupted texture information to recover the corrupted image. Evaluations on popular image benchmark datasets show that our approach substantially outperforms previous state-of-the-art methods, which are more complex and require either corrupted texture or remnant noise-free pixel information.
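
The abstract does not give implementation details, but the following minimal NumPy sketch illustrates the inverse-distance weighting idea it describes: each impulse-corrupted pixel is predicted from nearby noise-free pixels, weighted by the inverse of their distance. The function names, the salt-and-pepper detector, and the `power` and `max_radius` parameters are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def detect_impulse_pixels(img):
    # Hypothetical detector: treats extreme intensities as salt-and-pepper
    # impulses, a common simplifying assumption for 8-bit images.
    return (img == 0) | (img == 255)

def idw_restore(img, noise_mask, power=2.0, max_radius=10):
    """Replace each noisy pixel with an inverse-distance-weighted
    average of nearby noise-free pixels (illustrative sketch)."""
    restored = img.astype(np.float64)  # astype returns a copy
    h, w = img.shape
    for r, c in zip(*np.nonzero(noise_mask)):
        # Grow the search window until it contains at least one noise-free
        # pixel; this is what lets an IDW-based prediction model still
        # produce estimates under very high impulse-noise densities.
        for radius in range(1, max_radius + 1):
            r0, r1 = max(r - radius, 0), min(r + radius + 1, h)
            c0, c1 = max(c - radius, 0), min(c + radius + 1, w)
            clean = ~noise_mask[r0:r1, c0:c1]
            if clean.any():
                yy, xx = np.nonzero(clean)
                dists = np.hypot(yy + r0 - r, xx + c0 - c)
                weights = dists ** -power  # inverse-distance weights
                values = img[r0:r1, c0:c1][yy, xx].astype(np.float64)
                restored[r, c] = np.sum(weights * values) / np.sum(weights)
                break
    return np.clip(np.rint(restored), 0, 255).astype(img.dtype)

# Usage: restored = idw_restore(noisy, detect_impulse_pixels(noisy))
```

Note that the weights depend only on pixel geometry, not on the corrupted intensities themselves, which is consistent with the abstract's claim that the method avoids using corrupted texture information.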
