Abstract

Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method, based on cross-correlating uniformly distributed patches, takes no account of the specific geometric transformation between images or of the characteristics of ground scatterers. It is therefore inefficient and struggles to produce satisfactory co-registration results for image pairs with relatively large distortions or large incoherent areas. To address this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and the image content into consideration. Firstly, geometric transformations between the images, including scale, flip, rotation, and shear, were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by combining signal-to-clutter-ratio (SCR) thresholds with the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using two TerraSAR-X scenes of the Hong Kong airport and 21 PALSAR scenes of the Donghai Bridge. The experimental results demonstrate that the proposed method improves the accuracy and efficiency of co-registration and can handle image pairs with large distortions between them or large incoherent areas within them. For most co-registration tasks, the proposed method can enhance the reliability and applicability of co-registration and thus raise the level of automation.
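The standard offset-measurement step described above can be sketched in a few lines. The snippet below estimates the integer-pixel offset between a master and a slave patch via FFT-based cross-correlation; it is a minimal illustration of the general technique, not the paper's implementation, and the function name and window handling are assumptions.

```python
import numpy as np

def patch_offset(master, slave):
    """Estimate the integer-pixel (row, col) offset of `slave` relative to
    `master` via FFT-based cross-correlation (hypothetical helper, not the
    paper's code).  Assumes patches of equal shape and circular shifts."""
    m = master - master.mean()
    s = slave - slave.mean()
    # Correlation theorem: correlation = IFFT(conj(FFT(m)) * FFT(s)).
    xc = np.fft.ifft2(np.conj(np.fft.fft2(m)) * np.fft.fft2(s)).real
    peak = np.unravel_index(np.argmax(xc), xc.shape)
    # Wrap indices above N/2 back to negative shifts.
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, xc.shape))
```

In practice the correlation surface is oversampled around the peak to reach sub-pixel accuracy, and offsets from many patches are combined into the polynomial model; this sketch covers only the per-patch integer estimate.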

Highlights

  • Interferometric synthetic aperture radar (InSAR) technology has been widely used in acquiring topography maps [1] and detecting surface deformations [2,3]

  • That method has difficulties in obtaining satisfactory co-registration results in the following special situations: (1) for image pairs with relatively large distortions, the match patches may suffer from high uncertainty in the estimated range or azimuth offsets, and (2) for image pairs with large incoherent areas, the standard method may produce a high percentage of unreliable matches

  • Significant non-translational relationships between images arise in some situations
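The second highlight, selecting registration points by SCR thresholds and amplitude, can be illustrated with a toy point-target detector. The window size, threshold, and the approximation of SCR as the centre pixel's power over the mean power of its surrounding window are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def select_scr_candidates(amplitude, win=7, scr_db=6.0):
    """Flag pixels whose signal-to-clutter ratio (SCR) exceeds a threshold.
    SCR is approximated here as the pixel's power over the mean power of its
    win x win neighbourhood with the centre excluded (illustrative sketch)."""
    power = amplitude.astype(float) ** 2
    h = win // 2
    rows, cols = power.shape
    mask = np.zeros_like(power, dtype=bool)
    for i in range(h, rows - h):
        for j in range(h, cols - h):
            window = power[i - h:i + h + 1, j - h:j + h + 1].copy()
            window[h, h] = np.nan          # exclude the candidate pixel
            scr = 10 * np.log10(power[i, j] / np.nanmean(window))
            mask[i, j] = scr >= scr_db
    return mask
```

A strong scatterer on a homogeneous background passes the threshold while its clutter neighbours do not, which is the behaviour the point-selection step relies on.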


Summary

Introduction

Interferometric synthetic aperture radar (InSAR) technology has been widely used in acquiring topographic maps [1] and detecting surface deformations [2,3]. The standard method estimates the offsets using cross-correlation between image patches and determines the coefficients in Equation (1), where (a0, …, a5; b0, …, b5) are the coefficients of the quadric polynomial function, by performing least-squares adjustment. This means that the method is based on the assumption that only a translational relationship exists between each pair of image patches when conducting co-registration, so that the pixel offset of each patch can be determined by matching patches of an appropriate size. This accuracy paradox makes the design of an appropriately sized co-registration patch a challenge when the intersection angle between the satellite orbits or the normal baseline is large. Experimental results indicate that the proposed method can overcome the co-registration difficulties caused by non-parallel orbits, a large normal baseline, and severe decorrelation of signals, and can effectively improve the co-registration accuracy.
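Equation (1) is not reproduced on this page, but the standard quadric (second-order) polynomial model with coefficients (a0, …, a5; b0, …, b5) is conventionally written with the terms 1, r, c, rc, r², c²; that term ordering is an assumption here. Under that assumption, the least-squares adjustment the paragraph describes can be sketched as:

```python
import numpy as np

def fit_offset_polynomial(r, c, dr, dc):
    """Fit the quadric polynomial offset model
        dr = a0 + a1*r + a2*c + a3*r*c + a4*r**2 + a5*c**2
        dc = b0 + b1*r + b2*c + b3*r*c + b4*r**2 + b5*c**2
    by least squares, given range/azimuth coordinates (r, c) of the match
    points and their measured offsets (dr, dc).  The term ordering is an
    assumption; the source only names the coefficients (a0..a5, b0..b5)."""
    A = np.column_stack([np.ones_like(r), r, c, r * c, r**2, c**2])
    a, *_ = np.linalg.lstsq(A, dr, rcond=None)
    b, *_ = np.linalg.lstsq(A, dc, rcond=None)
    return a, b
```

At least six well-distributed match points are needed to determine the six coefficients per direction; in practice many more are used so that the adjustment averages out individual offset errors.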

Method
Workflow
Experiments and Discussions
June 2009
Experiment of the PALSAR Data for the Donghai Bridge
Findings
Conclusions

