Abstract

Differentiable architecture search incurs substantial computational cost during the search phase and suffers from a depth gap when the searched architecture is deepened. In this paper, we propose an attention-based progressive partially connected neural architecture search method (PPCAtt-NAS) to address these two issues. First, we introduce a progressive search strategy that gradually builds up the sophistication of the architecture and performs path-level pruning in stages to bridge the depth gap. Second, we adopt a partial search scheme that performs channel-level partial sampling of the network architecture to further reduce the computational complexity of the search. In addition, an attention mechanism is devised to improve the search capability by strengthening the relevance among feature channels. Finally, extensive comparison experiments against state-of-the-art methods on several public datasets show that our method achieves superior architecture performance.
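The abstract gives no implementation details, so the following is only a minimal PyTorch sketch of the three ingredients it names: channel-level partial sampling in a DARTS-style mixed operation, a channel-attention gate, and stage-wise path-level pruning. All class and function names, the candidate-operation list, the squeeze-and-excitation form of the attention, and the sampling ratio `k` are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style gate over feature channels (an assumed
    form; the abstract does not specify the paper's attention module)."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        hidden = max(channels // reduction, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        gates = self.fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)  # global pool -> per-channel gate
        return x * gates


class PartialChannelMixedOp(nn.Module):
    """DARTS-style mixed operation with channel-level partial sampling:
    only 1/k of the channels pass through the weighted candidate ops,
    the rest bypass the mixture, and a channel shuffle varies which
    channels get sampled at the next step."""

    def __init__(self, channels, ops, k=4):
        super().__init__()
        self.k = k
        self.active = channels // k          # slice of channels that is actually searched
        self.attn = ChannelAttention(self.active)
        self.ops = nn.ModuleList(ops)        # candidates built on `self.active` channels

    def forward(self, x, alphas):
        x_active, x_bypass = x[:, :self.active], x[:, self.active:]
        x_active = self.attn(x_active)       # re-weight channels before mixing
        weights = F.softmax(alphas, dim=-1)  # continuous relaxation over candidate ops
        mixed = sum(w * op(x_active) for w, op in zip(weights, self.ops))
        out = torch.cat([mixed, x_bypass], dim=1)
        b, c, h, w = out.shape               # channel shuffle across the k groups
        return out.view(b, self.k, c // self.k, h, w).transpose(1, 2).reshape(b, c, h, w)


def prune_candidates(alphas, ops, keep):
    """Stage-wise path-level pruning (a simplification of the progressive
    strategy): keep the candidates with the largest architecture weights
    before deepening the search network in the next stage."""
    top = torch.topk(alphas, keep).indices.tolist()
    kept_alphas = alphas[top].clone().detach().requires_grad_(True)
    return kept_alphas, [ops[i] for i in top]


# Illustrative usage with three candidate ops on the 1/k channel slice.
C, k = 16, 4
ops = [
    nn.Conv2d(C // k, C // k, 3, padding=1),
    nn.AvgPool2d(3, stride=1, padding=1),
    nn.Identity(),
]
mix = PartialChannelMixedOp(C, ops, k=k)
alphas = torch.randn(len(ops), requires_grad=True)   # architecture parameters
y = mix(torch.randn(2, C, 8, 8), alphas)             # -> (2, 16, 8, 8)
alphas, ops = prune_candidates(alphas, ops, keep=2)  # next stage searches fewer paths
```

Because only `channels // k` channels enter the candidate operations, both the memory footprint and the compute of one search step shrink by roughly a factor of k, which is the sense in which partial sampling reduces search cost.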
