Abstract

The many imaging sensors mounted on remote sensing platforms capture different kinds of information about land cover, so the classification of multi-sensor data offers potential advantages: a multispectral image records information from the visible spectrum, while a SAR image reflects properties of the microwave band. However, the redundancy across sensors also poses a challenge to traditional classification methods. This paper presents a joint classification method that combines images from a multispectral sensor and a synthetic-aperture radar (SAR) sensor. The proposed method is based on deep feature fusion: a deep network with two separate feature learning branches, one for the multispectral image and one for the SAR image, whose outputs are merged into further fully connected layers that fuse and optimize the features; finally, a classification layer on top of the network predicts the sample label. The method exploits the complementary information of the different sensors and provides a strategy for utilizing multi-sensor data. Experimental results demonstrate that it achieves better performance than classification using either data source alone.
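As a concrete illustration of the architecture described above, the following is a minimal PyTorch sketch of a two-branch deep feature fusion network. All layer counts, channel widths, band counts, and patch sizes are illustrative assumptions; the abstract does not specify them.

import torch
import torch.nn as nn

class FusionNet(nn.Module):
    def __init__(self, ms_bands=4, sar_bands=1, num_classes=10):
        super().__init__()
        # Branch 1: feature learning for the multispectral image patch.
        # (Widths and depths are assumed, not taken from the paper.)
        self.ms_branch = nn.Sequential(
            nn.Conv2d(ms_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Branch 2: feature learning for the SAR image patch.
        self.sar_branch = nn.Sequential(
            nn.Conv2d(sar_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fully connected layers fuse and refine the concatenated features.
        self.fusion = nn.Sequential(
            nn.Linear(64 + 64, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        # Classification layer on top predicts the sample label.
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, ms_patch, sar_patch):
        # Each modality is encoded separately, then fused by concatenation.
        fused = torch.cat([self.ms_branch(ms_patch),
                           self.sar_branch(sar_patch)], dim=1)
        return self.classifier(self.fusion(fused))

# Usage: one multispectral patch (4 bands) and one co-registered SAR patch.
model = FusionNet()
logits = model(torch.randn(1, 4, 16, 16), torch.randn(1, 1, 16, 16))

The key design point is that each modality passes through its own branch before fusion, so the network can learn sensor-specific features prior to the shared fully connected layers that combine them.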
