Abstract
Reconstructing forest height and the underlying terrain is a central goal of remote sensing over forested areas. Theoretically, Synthetic Aperture Radar Tomography (TomoSAR) offers the possibility to solve the layover problem, making it possible to estimate the elevation of scatterers located in the same resolution cell. This paper describes a deep learning approach, named Tomographic SAR Neural Network (TSNN), that aims at reconstructing forest and ground height using multipolarimetric multibaseline (MPMB) SAR data and Light Detection and Ranging (LiDAR)-based data. The reconstruction of forest and ground height is formulated as a classification problem, in which TSNN, a feed-forward network, is trained using covariance matrix elements as input vectors and quantized LiDAR-based heights as the reference. In our work, TSNN is trained and tested with P-band MPMB data acquired by ONERA over the Paracou region of French Guiana within the framework of the European Space Agency's TROPISAR campaign, together with LiDAR-based data provided by the French Agricultural Research Center. The novelty of the proposed TSNN lies in its ability to estimate heights in close agreement with LiDAR-based measurements and actual heights, with no requirement for phase calibration. Experimental results for different covariance window sizes demonstrate that TSNN performs height estimation with high spatial resolution and vertical accuracy, outperforming two other TomoSAR methods. Moreover, experiments on the effects of phase errors of different magnitudes show that TSNN tolerates small errors well and is still able to accurately reconstruct forest heights.
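The classification formulation described above can be illustrated with a minimal sketch: a feed-forward network maps the flattened elements of a pixel's MPMB covariance matrix to one of several quantized height classes, whose bin centre gives the height estimate. All dimensions, layer sizes, and the bin width below are illustrative assumptions, not the paper's actual configuration, and the weights are random rather than trained on LiDAR references.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup (illustrative only, not the paper's configuration):
# N_TRACKS flight tracks x N_POL polarizations -> K x K covariance matrix.
N_TRACKS, N_POL = 6, 3
K = N_TRACKS * N_POL            # covariance matrix dimension
N_IN = K * K * 2                # real + imaginary parts, flattened
N_CLASSES = 32                  # number of quantized LiDAR height bins (assumed)
BIN_WIDTH = 2.0                 # metres per height bin (assumed)

def features_from_covariance(C):
    """Flatten a complex covariance matrix into a real-valued input vector."""
    return np.concatenate([C.real.ravel(), C.imag.ravel()])

# One-hidden-layer feed-forward classifier. In the actual method these
# weights would be learned from quantized LiDAR-based reference heights.
W1 = rng.standard_normal((N_IN, 64)) * 0.01
b1 = np.zeros(64)
W2 = rng.standard_normal((64, N_CLASSES)) * 0.01
b2 = np.zeros(N_CLASSES)

def predict_height(C):
    x = features_from_covariance(C)
    h = np.maximum(x @ W1 + b1, 0.0)    # ReLU hidden layer
    logits = h @ W2 + b2
    k = int(np.argmax(logits))          # predicted height class
    return (k + 0.5) * BIN_WIDTH        # class index -> bin-centre height

# Example: a Hermitian covariance matrix estimated from a small pixel stack.
y = rng.standard_normal((K, 9)) + 1j * rng.standard_normal((K, 9))
C = y @ y.conj().T / 9.0
height = predict_height(C)
```

Because the output is a class index rather than an interferometric phase, this formulation needs no phase calibration step; the quantization step `BIN_WIDTH` bounds the best-case vertical resolution of the estimate.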
Published in: IEEE Transactions on Geoscience and Remote Sensing