Abstract
Unsupervised visual representation learning is one of the hottest topics in computer vision, yet its performance still lags behind that of the best supervised learning methods. At the same time, neural architecture search (NAS) has produced state-of-the-art results on various visual tasks. It is therefore natural to explore NAS as a way to improve unsupervised representation learning, yet this direction remains largely unexplored. In this paper, we propose a Fast and Unsupervised Neural Architecture Evolution (FaUNAE) method that evolves an existing architecture, either manually constructed or obtained by NAS on a small dataset, into a new architecture that can operate on a larger dataset. This partial optimization utilizes prior knowledge to reduce search cost and improve search efficiency. The evolution is self-supervised: a contrastive loss serves as the evaluation metric in a student-teacher framework. By eliminating inferior or least promising operations, the evolutionary process is greatly accelerated. Experimental results show that we achieve state-of-the-art performance on downstream applications such as object recognition, object detection, and instance segmentation.
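The abstract mentions a contrastive loss used as the evaluation metric in a student-teacher framework. As a rough illustration only (not the paper's implementation), the following is a minimal NumPy sketch of an InfoNCE-style contrastive loss, where a student query embedding is scored against a teacher's positive key and a set of negative keys; the function name, shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(q, k_pos, k_negs, tau=0.07):
    """Illustrative InfoNCE-style contrastive loss for one query.

    q      : (d,)   student embedding of one augmented view
    k_pos  : (d,)   teacher embedding of another view of the same image
    k_negs : (n, d) teacher embeddings of other images (negatives)
    tau    : temperature (0.07 is a common choice, assumed here)
    """
    # L2-normalize so dot products become cosine similarities.
    q = q / np.linalg.norm(q)
    k_pos = k_pos / np.linalg.norm(k_pos)
    k_negs = k_negs / np.linalg.norm(k_negs, axis=1, keepdims=True)

    # Similarity logits: the positive pair first, then the negatives.
    logits = np.concatenate(([q @ k_pos], k_negs @ q)) / tau

    # Cross-entropy with the positive as the target class (index 0).
    logits -= logits.max()  # numerical stability
    return -logits[0] + np.log(np.exp(logits).sum())
```

A well-aligned positive key yields a low loss, while treating an unrelated embedding as the positive yields a higher one, which is what lets the loss rank candidate architectures without labels.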