Abstract

Convolutional neural networks (CNNs) are widely used for hyperspectral image (HSI) classification. However, CNN architectures are often designed manually, which requires careful fine-tuning. Recently, many neural architecture search (NAS) techniques have been proposed to design networks automatically, but most of these methods consider only the overall classification accuracy and ignore the balance between floating point operations (FLOPs) and the number of parameters. In this paper, we propose a new multi-objective optimization (MO) method, called MO-CNN, to automatically design CNN structures for HSI classification. First, an MO method based on continuous particle swarm optimization (CPSO) is constructed, in which the overall accuracy, the FLOPs, and the number of parameters are jointly considered, to obtain an optimal architecture from the Pareto front. Then, an auxiliary skip connection strategy is added (together with a partial connection strategy) to avoid performance collapse and to reduce memory consumption. Furthermore, an end-to-end band selection network (BS-Net) is used to reduce redundant bands and to maintain spectral-spatial uniformity. To demonstrate the performance of the newly proposed MO-CNN in scenarios with limited training sets, a quantitative and comparative analysis (including ablation studies) is conducted. Our optimization strategy is shown to improve classification accuracy, reduce memory consumption, and obtain an optimal CNN structure on unbiased datasets.
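The abstract selects an architecture from the Pareto front of three objectives (accuracy to maximize; FLOPs and parameter count to minimize). The sketch below is an illustrative, self-contained example of that selection step only, not the paper's implementation; the candidate values and helper names (`dominates`, `pareto_front`) are hypothetical.

```python
def dominates(a, b):
    """True if architecture a dominates b: no worse on every
    objective and strictly better on at least one.
    Tuples are (accuracy, FLOPs, params); accuracy is maximized,
    the other two are minimized."""
    acc_a, flops_a, params_a = a
    acc_b, flops_b, params_b = b
    no_worse = acc_a >= acc_b and flops_a <= flops_b and params_a <= params_b
    strictly_better = acc_a > acc_b or flops_a < flops_b or params_a < params_b
    return no_worse and strictly_better

def pareto_front(candidates):
    """Return the non-dominated subset of candidate architectures."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# Made-up (accuracy %, GFLOPs, params in millions) for five candidates.
candidates = [
    (95.1, 2.4, 1.8),
    (94.8, 1.1, 0.9),   # cheaper but slightly less accurate: non-dominated
    (93.0, 2.6, 2.0),   # dominated by the first candidate
    (95.5, 3.9, 3.1),   # most accurate but most expensive: non-dominated
    (92.0, 1.1, 0.9),   # dominated by the second candidate
]

front = pareto_front(candidates)
print(front)  # the three non-dominated trade-off points
```

In a NAS loop such as the one described, a search method (here, CPSO in the paper) would propose candidates, evaluate these three objectives for each, and pick the final architecture from the resulting front according to the deployment budget.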
