Abstract

Edge-preserving image filtering is a valuable tool for a variety of applications in image processing and computer vision. Motivated by the simple yet effective local Laplacian filter, we propose a scalable and efficient image filtering framework that extends this edge-preserving filter and admits a uniform O(N)-time implementation. The proposed framework is built upon a practical global-to-local strategy. The input image is first remapped globally by a series of tentative remapping functions to generate a virtual candidate image sequence (Virtual Image Pyramid Sequence, VIPS). This sequence is then recombined locally into a single output image by a flexible edge-aware pixel-level fusion rule. To avoid halo artifacts, both the output image and the virtual candidate images are transformed into multi-resolution pyramid representations. Four applications, single-image dehazing, multi-exposure fusion, fast edge-preserving filtering, and tone mapping, are presented as concrete instances of the proposed framework. Experiments on filtering quality and computational efficiency indicate that the proposed framework can construct a wide range of fast image filters that yield visually compelling results.
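The sketch below illustrates the global-to-local structure described above under simplifying assumptions: a grayscale image in [0, 1], a hypothetical detail-compressing remapping function, and tent-shaped interpolation weights as the edge-aware fusion rule. The candidate images are produced by global remappings at K sampled reference intensities, and both the candidates and the output are represented as Laplacian pyramids before fusion; the paper's actual remapping functions and fusion rule may differ.

```python
import numpy as np
import cv2

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels):
        h, w = gp[i].shape[:2]
        lp.append(gp[i] - cv2.pyrUp(gp[i + 1], dstsize=(w, h)))
    lp.append(gp[-1])                      # low-pass residual band
    return lp

def collapse(lp):
    img = lp[-1]
    for band in reversed(lp[:-1]):
        h, w = band.shape[:2]
        img = cv2.pyrUp(img, dstsize=(w, h)) + band
    return img

def remap(img, g, sigma, alpha):
    # Hypothetical global remapping: compress detail around reference intensity g.
    d = img - g
    return g + np.sign(d) * sigma * np.abs(d / sigma) ** alpha

def vips_filter(img, levels=4, K=8, sigma=0.1, alpha=0.5):
    """Global-to-local sketch: build K globally remapped candidates (the VIPS),
    then fuse their Laplacian-pyramid coefficients per pixel, guided by the
    Gaussian pyramid of the input, and collapse the fused pyramid."""
    gp = gaussian_pyramid(img, levels)
    samples = np.linspace(0.0, 1.0, K)
    candidates = [laplacian_pyramid(remap(img, g, sigma, alpha), levels)
                  for g in samples]
    step = samples[1] - samples[0]
    fused = []
    for l in range(levels + 1):
        guide = np.clip(gp[l], 0.0, 1.0)
        coeff = np.zeros_like(guide)
        for k, g in enumerate(samples):
            # Tent weight: each coefficient comes mostly from the candidate
            # whose reference intensity matches the local intensity.
            w = np.clip(1.0 - np.abs(guide - g) / step, 0.0, 1.0)
            coeff += w * candidates[k][l]
        fused.append(coeff)
    return np.clip(collapse(fused), 0.0, 1.0)

if __name__ == "__main__":
    img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
    out = vips_filter(img)
    cv2.imwrite("output.png", (out * 255).astype(np.uint8))
```

Because every candidate is produced by a single global remapping and the pyramids are built with fixed-size kernels, the cost is linear in the number of pixels for a fixed K, which is what makes a uniform O(N) implementation possible.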
