Abstract

3D scene reconstruction provides an improved representation from which features of critical objects or targets may be extracted. Both electro-optical (EO) and synthetic aperture radar (SAR) sensors have been exploited for this purpose, but each modality introduces its own sources of reconstruction error. Reconstruction from EO data is limited by frame rate and can be blurred by moving targets or optical distortions in the lens, which leads to errors in the 3D model. SAR offers the opportunity to correct some of these errors through its capacity for range measurement, even under cloud cover or at night, when EO data would not be available. Conversely, SAR imagery lacks the texture offered by optical images and is more sensitive to perspective, and moving targets can likewise introduce reconstruction errors. This work aims to exploit the strengths of both modalities to reconstruct 3D scenes from multi-sensor EO-SAR data. In particular, we consider the fusion of multi-pass Gotcha SAR data with modeled EO data for the same scene. We propose a framework that fuses 2D image maps acquired from airborne EO and airborne SAR sensors, leveraging the range information of SAR and the object shape information of EO imagery. Starting from an initial 2D image of the scene, a 3D reconstruction is formed and iteratively improved with each additional source of sensor data (EO or SAR). This approach offers the potential for robust, real-time 3D representations as a basis for 4D surveillance.
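
To make the iterative-refinement idea concrete, the sketch below shows one plausible way such a fusion loop could be structured: a gridded scene estimate with per-cell uncertainty that is updated by successive EO- or SAR-derived height measurements using inverse-variance (Kalman-style) weighting. This is a minimal conceptual sketch, not the authors' implementation; the class name, noise values, and simulated passes are illustrative assumptions only.

# Conceptual sketch (not the authors' code): iterative per-cell fusion of
# height estimates from successive EO and SAR passes over a scene.
# All names and noise parameters below are hypothetical illustrations.
import numpy as np


class IterativeSceneModel:
    """Maintains a 2.5D height map that is refined with each sensor pass.

    Each grid cell stores a height estimate and its variance; a new
    observation (EO-derived shape/height cue or SAR range-derived height)
    is fused with a per-cell inverse-variance weighted update.
    """

    def __init__(self, shape, prior_height=0.0, prior_var=1e3):
        self.height = np.full(shape, prior_height, dtype=float)
        self.var = np.full(shape, prior_var, dtype=float)

    def update(self, measured_height, measurement_var, valid_mask=None):
        """Fuse one pass of measurements into the running estimate."""
        if valid_mask is None:
            valid_mask = np.ones_like(self.height, dtype=bool)
        # Kalman-style scalar update per cell: gain = P / (P + R)
        gain = self.var / (self.var + measurement_var)
        innovation = measured_height - self.height
        self.height = np.where(valid_mask, self.height + gain * innovation, self.height)
        self.var = np.where(valid_mask, (1.0 - gain) * self.var, self.var)
        return self.height


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_scene = np.zeros((64, 64))
    true_scene[20:40, 20:40] = 10.0          # a 10 m tall structure

    model = IterativeSceneModel(true_scene.shape)
    # Simulated passes: SAR provides noisier but all-weather range-derived
    # heights, EO provides lower-noise shape cues when visibility allows.
    for sensor, noise_var in [("SAR", 4.0), ("EO", 1.0), ("SAR", 4.0)]:
        obs = true_scene + rng.normal(0.0, np.sqrt(noise_var), true_scene.shape)
        model.update(obs, noise_var)
    print("RMS error after 3 passes:", np.sqrt(np.mean((model.height - true_scene) ** 2)))

Each additional pass shrinks the per-cell variance, which mirrors the abstract's claim that the 3D reconstruction improves as further EO or SAR data arrive; the paper's actual fusion of 2D image maps is more involved than this per-cell toy model.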
