Abstract
In this work, we present a differentiable neural architecture search (NAS) method that jointly considers two competing objectives, quality of result (QoR) and quality of service (QoS), under hardware design constraints. NAS research has recently received a lot of attention due to its ability to automatically find architecture candidates that can outperform handcrafted ones. However, NAS approaches that comply with actual hardware (HW) design constraints remain under-explored. A naive approach would be to optimize a weighted combination of the QoR and QoS criteria, but this simple extension of the prior art often yields degenerate architectures and suffers from sensitive hyperparameter tuning. In this work, we propose a multi-objective differentiable neural architecture search, called MDARTS. MDARTS has an affordable search time and can find the Pareto frontier of QoR versus QoS. We also identify a problematic gap between the architectures found by existing differentiable NAS methods and the final post-processed architectures in which soft connections are binarized; this gap leads to performance degradation when the model is deployed. To mitigate this gap, we propose a separation loss that discourages indefinite connections between components by implicitly minimizing their entropy.
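The sketch below illustrates the general idea of an entropy-based separation term combined with a QoR/QoS objective, as it might look in a DARTS-style supernet. It is a minimal illustration, assuming PyTorch: the names `alphas`, `latency_proxy`, and the loss weights `lam_qos` and `lam_sep` are assumptions for exposition, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def separation_loss(alphas):
    """Entropy-style separation term (illustrative sketch only).

    `alphas` is assumed to be a list of architecture-parameter tensors, one
    per edge of a DARTS-style supernet. Each row is softmax-normalized into
    operation probabilities; low entropy pushes every edge toward a definite,
    near one-hot choice, shrinking the gap between the soft supernet and the
    binarized final architecture.
    """
    loss = 0.0
    for a in alphas:
        p = F.softmax(a, dim=-1)                           # soft connection weights
        entropy = -(p * torch.log(p + 1e-8)).sum(dim=-1)   # per-edge entropy
        loss = loss + entropy.mean()
    return loss

def total_loss(task_loss, latency_proxy, alphas, lam_qos=0.1, lam_sep=0.01):
    """Combine QoR (task loss), QoS (a differentiable latency proxy), and the
    separation term. The weights here are placeholders, not tuned values."""
    return task_loss + lam_qos * latency_proxy + lam_sep * separation_loss(alphas)
```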