Abstract

The research and development of deep learning cannot be separated from deep neural networks (DNNs). DNNs have become deeper and more complex in pursuit of accuracy and precision, leading to significantly longer inference times and higher training costs. Existing deep learning frameworks optimize a DNN's runtime performance by transforming its computational graph according to hand-written rules, an approach that scales poorly when new operators are added to DNNs. TASO can automatically generate graph substitutions, which solves this maintainability problem: an optimized graph is found by applying a sequence of graph substitutions. However, TASO considers only the runtime performance of the model during the search, which may miss potential optimizations. We propose HeuSO, a fine-grained computational graph optimizer with heuristics, to address this problem. HeuSO extracts the types and numbers of operators in the computational graph and classifies them into four abstract types as high-level features, which facilitate the subsequent heuristic search and pruning algorithms. HeuSO generates a better sequence of graph substitutions and finds a better-optimized graph via its heuristic function, which integrates the cost and the high-level features of the model. To further reduce search time, HeuSO implements a pruning algorithm: using the high-level specifications, it can quickly determine whether subgraphs of the original graph match the substitution rules. Evaluations on seven DNNs demonstrate that HeuSO outperforms state-of-the-art frameworks with a 2.35× speedup while reducing search time by up to 1.58×.
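The search the abstract describes can be pictured as a best-first exploration of candidate graphs, ranked by a heuristic that combines measured cost with coarse operator-type features. The sketch below is an illustration only: the category names, the feature weighting, and the `fuse_conv_relu` substitution are all assumptions for the example, not HeuSO's actual implementation.

```python
import heapq

def op_features(graph):
    # Abstract each operator into one of four coarse categories and count them.
    # The category assignment here is a hypothetical example.
    CATEGORIES = {"conv": "compute", "matmul": "compute",
                  "relu": "elementwise", "add": "elementwise",
                  "concat": "layout", "split": "layout",
                  "pool": "reduce"}
    feats = {"compute": 0, "elementwise": 0, "layout": 0, "reduce": 0}
    for op in graph["ops"]:
        feats[CATEGORIES.get(op, "elementwise")] += 1
    return feats

def heuristic(graph, alpha=0.1):
    # Combine the graph's cost with a feature term; here we (arbitrarily)
    # penalize layout-shuffling operators. The weight alpha is an assumption.
    return graph["cost"] + alpha * op_features(graph)["layout"]

def fuse_conv_relu(g):
    # Example substitution rule: fuse a conv and a relu into one operator.
    # Returning None models the pruning step: the rule does not match.
    if "conv" in g["ops"] and "relu" in g["ops"]:
        ops = list(g["ops"])
        ops.remove("conv")
        ops.remove("relu")
        ops.append("conv_relu")
        return {"ops": ops, "cost": g["cost"] - 1.0}
    return None

def search(initial, substitutions, budget=100):
    # Best-first search over graphs reachable via substitution sequences,
    # ordered by the heuristic; returns the cheapest graph seen.
    best = initial
    tick = 0  # tie-breaker so dicts are never compared in the heap
    frontier = [(heuristic(initial), tick, initial)]
    seen = {tuple(sorted(initial["ops"]))}
    while frontier and budget > 0:
        _, _, g = heapq.heappop(frontier)
        budget -= 1
        if g["cost"] < best["cost"]:
            best = g
        for sub in substitutions:
            ng = sub(g)
            if ng is None:
                continue  # rule did not match: this branch is pruned
            key = tuple(sorted(ng["ops"]))
            if key not in seen:
                seen.add(key)
                tick += 1
                heapq.heappush(frontier, (heuristic(ng), tick, ng))
    return best
```

For instance, starting from a toy graph `{"ops": ["conv", "relu", "concat"], "cost": 10.0}` with only `fuse_conv_relu` available, the search applies the fusion once and returns the cheaper fused graph.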
