Abstract

Convolutional neural networks (CNNs) have been shown to achieve state-of-the-art performance on image segmentation tasks. However, designing a well-suited CNN architecture requires a great deal of empirical knowledge, which can be expensive or unavailable in many real-world applications. To address this problem, neural architecture search (NAS) has been developed to optimize the network architecture and hyperparameters simultaneously, but its main disadvantage is the extremely high demand for computational resources. To accelerate the NAS of CNNs, a two-stage adaptive lightweight convolutional neural architecture search (AL-CNAS) method based on an evolutionary algorithm is proposed in this article. In this method, the original search space is divided into two parts based on feature grouping, and only one part is optimized at each stage, yielding an exponential decrease in the number of genotype combinations; this improves search efficiency and reduces computational resource requirements. The proposed AL-CNAS is validated on the real-world segmentation of steel microstructures and on two benchmark segmentation problems, and the computational results show that it achieves good segmentation performance and generality.
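The sketch below is only a minimal illustration of the two-stage idea described in the abstract, not the authors' implementation: when the genotype is split into two groups, searching each group in its own stage scales roughly with the sum of the group sizes rather than their product. All names (GROUP_A, GROUP_B, evaluate, evolve) and the fitness function are hypothetical placeholders.

```python
import random

# Hypothetical gene groups (e.g., operation types vs. channel widths).
GROUP_A = ["conv3", "conv5", "sep_conv", "dilated_conv"]
GROUP_B = [8, 16, 32, 64]


def evaluate(genotype):
    """Stand-in fitness: a real system would train the candidate CNN
    and return a validation segmentation score (e.g., IoU)."""
    random.seed(hash(genotype) % (2**32))
    return random.random()


def evolve(candidates, fixed, stage, generations=5):
    """Evolve only one gene group while the other is held fixed."""
    make = (lambda g: (g, fixed)) if stage == 1 else (lambda g: (fixed, g))
    best = random.choice(candidates)
    for _ in range(generations):
        child = random.choice(candidates)  # mutation = resample this group
        if evaluate(make(child)) > evaluate(make(best)):
            best = child
    return best


# Stage 1: optimize group A with group B fixed to a default value.
best_a = evolve(GROUP_A, fixed=GROUP_B[0], stage=1)
# Stage 2: freeze the stage-1 result and optimize group B.
best_b = evolve(GROUP_B, fixed=best_a, stage=2)
print("Selected genotype:", (best_a, best_b))
```

Under this decomposition the two stages together examine on the order of |GROUP_A| + |GROUP_B| candidates instead of the |GROUP_A| × |GROUP_B| combinations of a joint search, which is the source of the efficiency gain the abstract refers to.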
