Abstract

This paper introduces an adaptive sampling methodology for automated compression of Deep Neural Networks (DNNs) to accelerate inference on resource-constrained platforms. Modern DNN compression techniques comprise various hyperparameters that require per-layer customization. Our objective is to locate the hyperparameter configuration that yields the lowest model complexity while adhering to a desired inference accuracy. We design a score function that evaluates this optimality and formulate the optimization problem as searching for the maximizers of the score function. To this end, we devise a non-uniform adaptive sampler that aims to reconstruct the band-limited score function; by realizing a targeted sampler, we reduce the total number of required objective function evaluations. We propose three adaptive sampling methodologies, namely AdaNS-Zoom, AdaNS-Genetic, and AdaNS-Gaussian, in which each new batch of samples is chosen based on the history of previous evaluations. Our algorithms start by sampling from a uniform distribution over the entire search space and iteratively adapt the sampling distribution to achieve the highest density around the function maxima. This, in turn, allows for a low-error reconstruction of the objective function around its maximizers. Our extensive evaluations corroborate the effectiveness of AdaNS, which outperforms existing rule-based and Reinforcement Learning methods in terms of DNN compression rate and/or inference accuracy.
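
To make the iterative adaptation concrete, the following minimal sketch illustrates the general idea of history-driven adaptive sampling in the spirit of AdaNS-Gaussian. It is not the paper's exact algorithm: the function `adaptive_gaussian_search`, the normalized search space, the elite-fraction heuristic, and the toy score function are all illustrative assumptions.

```python
import numpy as np

def adaptive_gaussian_search(score_fn, dim, n_iters=20, batch_size=64,
                             elite_frac=0.1, seed=0):
    """Illustrative sketch: begin with uniform sampling over [0, 1]^dim and
    iteratively refit a Gaussian around the best-scoring samples, so that
    sampling density concentrates near the maxima of score_fn."""
    rng = np.random.default_rng(seed)
    # Initial batch: uniform over the entire (normalized) search space.
    samples = rng.uniform(0.0, 1.0, size=(batch_size, dim))
    best_x, best_score = None, -np.inf

    for _ in range(n_iters):
        scores = np.array([score_fn(x) for x in samples])
        # Track the best configuration evaluated so far.
        if scores.max() > best_score:
            best_score = scores.max()
            best_x = samples[scores.argmax()].copy()
        # Keep the top-scoring ("elite") samples from the evaluation history
        # and adapt the sampling distribution around them.
        n_elite = max(2, int(elite_frac * batch_size))
        elite = samples[np.argsort(scores)[-n_elite:]]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
        # Draw the next batch from the adapted distribution, clipped to the box.
        samples = np.clip(rng.normal(mu, sigma, size=(batch_size, dim)),
                          0.0, 1.0)

    return best_x, best_score

if __name__ == "__main__":
    def toy_score(x):
        # Hypothetical stand-in for "accuracy minus complexity penalty"; a real
        # score would evaluate a compressed DNN under hyperparameters x.
        return -np.sum((x - 0.3) ** 2)

    x_star, s_star = adaptive_gaussian_search(toy_score, dim=4)
    print(x_star, s_star)
```

In practice, each candidate `x` would encode per-layer compression hyperparameters, and `score_fn` would measure the resulting model's accuracy and complexity.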
