Abstract

Synergistic applications based on integrated hyperspectral and lidar data are receiving growing interest from the remote-sensing community. A prerequisite for the optimum sensor fusion of hyperspectral and lidar data is an accurate geometric coalignment. The simple unadjusted integration of lidar elevation and hyperspectral reflectance causes a substantial loss of information and does not exploit the full potential of both sensors. This paper presents a novel approach for the geometric coalignment of hyperspectral and lidar airborne data based on their respective return intensity information. The complete approach incorporates ray tracing and subpixel procedures in order to overcome grid-inherent discretization. It aims to correct the extrinsic and intrinsic (camera resectioning) parameters of the hyperspectral sensor. In addition to a tie-point-based coregistration, we introduce a ray-tracing-based back projection of the lidar intensities for area-based cost aggregation. The approach consists of three processing steps. First is a coarse automatic tie-point-based boresight alignment. The second step coregisters the hyperspectral data to the lidar intensities. Third is a parametric coalignment refinement with area-based cost aggregation. This hybrid approach, combining tie-point features and area-based cost aggregation methods for the parametric coregistration of hyperspectral intensity values to their corresponding lidar intensities, results in a root-mean-square error of one-third of a pixel. This indicates that a highly integrated, rigorous combination of different coalignment methods leads to an improvement of the multisensor coregistration.
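To illustrate the area-based cost aggregation idea mentioned in the abstract, the following is a minimal, hypothetical Python sketch: it compares a hyperspectral intensity band against lidar return intensities resampled onto the same grid using a normalized cross-correlation cost, and performs a brute-force integer-pixel shift search as a stand-in for the coarse alignment step. All function and variable names (`area_cost`, `coarse_shift_search`) are illustrative assumptions, not the authors' implementation; the paper's full pipeline additionally involves ray tracing, subpixel procedures, and the correction of extrinsic and intrinsic sensor parameters.

```python
import numpy as np


def area_cost(hsi_band: np.ndarray, lidar_intensity: np.ndarray) -> float:
    """Negative normalized cross-correlation between two coaligned rasters.

    Lower values indicate better agreement; this is one common choice for
    an area-based similarity measure, used here purely for illustration.
    """
    a = hsi_band - hsi_band.mean()
    b = lidar_intensity - lidar_intensity.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0
    return -float((a * b).sum() / denom)


def coarse_shift_search(hsi_band, lidar_intensity, max_shift=5):
    """Brute-force integer-pixel shift search (assumption: pure translation),
    standing in for the coarse tie-point-based alignment step."""
    best = (0, 0, np.inf)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(lidar_intensity, dy, axis=0), dx, axis=1)
            c = area_cost(hsi_band, shifted)
            if c < best[2]:
                best = (dy, dx, c)
    return best


if __name__ == "__main__":
    # Synthetic example: the "hyperspectral" band is a shifted, noisy copy
    # of the lidar intensity raster, so the search should recover the shift.
    rng = np.random.default_rng(0)
    lidar = rng.random((64, 64))
    hsi = np.roll(np.roll(lidar, 2, axis=0), -3, axis=1) + 0.05 * rng.random((64, 64))
    dy, dx, cost = coarse_shift_search(hsi, lidar)
    print(f"estimated shift: dy={dy}, dx={dx}, cost={cost:.4f}")
```

In the paper's actual refinement step, such a cost would be aggregated over ray-traced, back-projected lidar intensities and minimized with respect to the sensor's parametric (extrinsic and intrinsic) model rather than a simple pixel shift.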
