Abstract

We present an effective way to solve the denoising problem for fringe patterns in optical interferometry. The proposed method is based on topological analysis of an appropriate cost function. To overcome the blurring drawback of the linear diffusion approach, the linear diffusion coefficient at each edge is perturbed successively. The total variation of a discrete cost function serves as an indicator function to select the pixel edges at which the diffusion coefficients are to be perturbed. A filtered image is then obtained using the selected diffusion coefficients associated with those edges. We demonstrate the performance of the proposed method by applying it to numerically simulated and experimentally obtained fringe patterns.
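The abstract does not give the discretization details, but the core ingredients it names — a linear diffusion filter with a coefficient attached to each pixel edge, and a discrete total-variation functional usable as an indicator — can be sketched as follows. This is a minimal illustration under assumed choices (explicit Euler time stepping, a 5-point stencil, a synthetic cosine fringe pattern, uniform coefficients standing in for the unperturbed baseline); the paper's actual cost function and perturbation-selection rule are not reproduced here.

```python
import numpy as np

def total_variation(img):
    # Discrete (anisotropic) TV: sum of absolute differences across
    # all horizontal and vertical pixel edges.
    return np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum()

def diffuse(img, c_v, c_h, tau=0.2, steps=50):
    # Explicit linear diffusion with one coefficient per pixel edge:
    # c_v weights edges between vertical neighbors, c_h between
    # horizontal neighbors. Perturbing selected entries toward zero
    # would inhibit smoothing across those edges (the idea the
    # abstract describes); here the coefficients are left uniform.
    u = img.astype(float).copy()
    for _ in range(steps):
        flux_v = c_v * np.diff(u, axis=0)   # flux across vertical-neighbor edges
        flux_h = c_h * np.diff(u, axis=1)   # flux across horizontal-neighbor edges
        div = np.zeros_like(u)
        div[:-1, :] += flux_v               # flux entering pixel from below
        div[1:, :] -= flux_v                # flux leaving pixel upward
        div[:, :-1] += flux_h
        div[:, 1:] -= flux_h
        u += tau * div                      # stable for tau <= 0.25 with c <= 1
    return u

# Synthetic noisy fringe pattern (hypothetical test data, not from the paper).
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 4 * np.pi, 128), np.linspace(0, 4 * np.pi, 128))
clean = np.cos(x + 0.5 * y)
noisy = clean + 0.3 * rng.standard_normal(clean.shape)

# Unperturbed baseline: all edge coefficients equal to 1 (plain linear diffusion).
c_v = np.ones((127, 128))
c_h = np.ones((128, 127))
filtered = diffuse(noisy, c_v, c_h)
```

With uniform coefficients this reduces to ordinary linear (heat-equation) diffusion, which lowers the total variation but blurs fringe edges; the method summarized above addresses that blurring by selectively perturbing individual edge coefficients, using the TV of the cost function to decide which ones.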
