Abstract

Differentiable architecture search incurs substantial computational cost during the search phase and suffers from the depth gap problem when the searched architecture is deepened. In this paper, we propose an attention-based progressive partially connected neural architecture search method (PPCAtt-NAS) to address these two issues. First, we introduce a progressive search strategy that gradually increases the complexity of the architecture during the search phase and performs path-level pruning in stages to bridge the depth gap. Second, we adopt a partial search scheme that performs channel-level partial sampling of the network architecture, further reducing the computational complexity of the search. In addition, we devise an attention mechanism that improves search capability by enhancing the relevance between feature channels. Finally, extensive comparison experiments against state-of-the-art methods on several public datasets show that our method discovers architectures with higher performance.
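To make the channel-level partial sampling and channel attention ideas concrete, below is a minimal PyTorch sketch of a partially connected mixed operation in the style of PC-DARTS, combined with an SE-style channel attention. The class names (`PartialMixedOp`, `ChannelAttention`), the sampling ratio `k`, and the candidate operation set are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def candidate_ops(c):
    # Illustrative subset of candidate operations on a searched edge.
    return nn.ModuleList([
        nn.Sequential(nn.Conv2d(c, c, 3, padding=1, bias=False), nn.BatchNorm2d(c)),
        nn.Sequential(nn.Conv2d(c, c, 5, padding=2, bias=False), nn.BatchNorm2d(c)),
        nn.AvgPool2d(3, stride=1, padding=1),
        nn.Identity(),
    ])


class ChannelAttention(nn.Module):
    """SE-style attention that reweights channels; an assumed stand-in for
    'enhancing the relevance between feature channels' in the abstract."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))           # global average pool -> (N, C)
        return x * w.unsqueeze(-1).unsqueeze(-1)  # channel-wise reweighting


class PartialMixedOp(nn.Module):
    """Mixed operation that searches over only 1/k of the channels."""
    def __init__(self, channels, k=4):
        super().__init__()
        self.k = k
        self.ops = candidate_ops(channels // k)
        self.attn = ChannelAttention(channels)
        # Architecture parameters, softmax-ed into operation weights.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

    def forward(self, x):
        x = self.attn(x)
        c = x.size(1) // self.k
        x_search, x_skip = x[:, :c], x[:, c:]     # sample a channel slice
        weights = F.softmax(self.alpha, dim=0)
        mixed = sum(w * op(x_search) for w, op in zip(weights, self.ops))
        out = torch.cat([mixed, x_skip], dim=1)
        # Channel shuffle so different slices get searched across steps.
        n, ch, h, wd = out.shape
        return out.view(n, self.k, ch // self.k, h, wd).transpose(1, 2).reshape(n, ch, h, wd)
```

For example, `PartialMixedOp(channels=16, k=4)(torch.randn(2, 16, 32, 32))` returns a tensor of the same shape, with only 4 of the 16 channels passing through the weighted candidate operations; this is what reduces the search cost relative to operating on all channels.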
