Abstract
Radar interferometry has been widely applied to measuring terrain height and its changes. Information about the surface can be derived from phase interferograms; however, inherent phase noise reduces the accuracy and reliability of that information. Hence, minimizing phase noise is essential before retrieving the surface information embedded in an interferometric phase. This paper presents a refined filter based on the Lee adaptive complex filter and the improved sigma filter originally developed for amplitude image filtering. The basic idea is to adaptively filter the interferometric phase according to the local noise level, after removing undesired pixels, so as to minimize the loss of signal for a particular fringe pattern, including extreme cases involving broken fringes. The goals are to preserve the fringe pattern, reduce phase bias and deviation, reduce the number of residues, and minimize the phase error. Fringe preservation is of particular concern in areas of high fringe frequency and large phase gradient, which correspond to steep terrain. The proposed refined filter was validated using both simulated and real interferometric data; results demonstrate that its filtering performance is better than that of commonly used filters.
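As a rough illustration of the complex-domain adaptive filtering idea summarized above (not the authors' exact algorithm), the sketch below smooths the unit phasor of a wrapped interferogram and blends the smoothed and original phasors according to an estimated local noise level. The window size and the coherence-based weighting are illustrative assumptions, not parameters taken from the paper.

```python
# A minimal sketch of adaptive complex-domain phase filtering, assuming a
# boxcar window and a coherence-like proxy for the local noise level: clean
# fringes are left largely intact, noisy regions are smoothed more strongly.
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_phase_filter(phase, window=5):
    """Filter a wrapped interferometric phase array (radians)."""
    z = np.exp(1j * phase)                    # unit-magnitude complex phasor
    re = uniform_filter(z.real, size=window)  # boxcar-averaged real part
    im = uniform_filter(z.imag, size=window)  # boxcar-averaged imaginary part
    z_smooth = re + 1j * im
    coh = np.abs(z_smooth)                    # ~1 on clean fringes, ~0 on noise
    alpha = 1.0 - coh                         # smooth more where noise is high
    z_norm = z_smooth / np.maximum(coh, 1e-6) # re-normalize smoothed phasor
    z_out = (1.0 - alpha) * z + alpha * z_norm
    return np.angle(z_out)

# Usage on simulated noisy linear fringes:
rng = np.random.default_rng(0)
x = np.linspace(0, 6 * np.pi, 256)
clean = np.angle(np.exp(1j * np.add.outer(x, x)))  # wrapped fringe pattern
noisy = np.angle(np.exp(1j * (clean + 0.8 * rng.standard_normal((256, 256)))))
filtered = adaptive_phase_filter(noisy)
```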