Heuristic Search for DNN Graph Substitutions

Feifei Deng, Hongkang Liu
DOI: 10.1145/3590003.3590044
Published in: Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning
Publication date: 2023-03-17
Citations: 0

Abstract

The research and development of deep learning is inseparable from deep neural networks (DNNs). DNNs are becoming deeper and more complex in pursuit of accuracy, which significantly increases inference time and training cost. Existing deep learning frameworks optimize a DNN's runtime performance by transforming its computational graph according to hand-written rules, an approach that scales poorly when new operators are added. TASO automatically generates graph substitutions, solving this maintainability problem, and explores an optimized graph by applying a sequence of substitutions. However, TASO considers only the model's runtime performance during the search, which may miss potential optimizations. We propose HeuSO, a fine-grained computational graph optimizer with heuristics that addresses this problem. HeuSO extracts the types and number of operators in the computational graph and classifies them into four abstract types as high-level features, which facilitate the subsequent heuristic search and pruning algorithms. Guided by a heuristic function that integrates the model's cost with these high-level features, HeuSO generates a better sequence of graph substitutions and finds a better-optimized graph. To further reduce search time, HeuSO implements a pruning algorithm: using the high-level specifications, it can quickly determine whether subgraphs of the original graph match the substitution rules. Evaluations on seven DNNs demonstrate that HeuSO outperforms state-of-the-art frameworks with a 2.35× speedup while accelerating the search by up to 1.58×.
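To make the abstract's idea concrete, the sketch below illustrates the general shape of such a search: a heuristic score that blends a graph's measured cost with counts of operators per abstract category, a best-first search over substitution sequences, and two kinds of pruning (skipping states whose score is far from the best so far, and skipping rules whose required operator category is absent from the feature counts). The category names, rule encoding, cost model, and all function names here are assumptions for illustration only; the paper does not specify HeuSO's actual four abstract types or its data structures.

```python
import heapq
from collections import Counter
from itertools import count

# Hypothetical operator taxonomy (the paper's four abstract types are not
# listed, so these category names are illustrative assumptions).
CATEGORY = {"conv": "compute", "matmul": "compute",
            "relu": "activation", "add": "elementwise", "concat": "layout"}

def features(ops):
    """High-level feature vector: operator count per abstract category."""
    return Counter(CATEGORY.get(op, "other") for op in ops)

def score(cost, ops, weights):
    """Heuristic blending runtime cost with weighted category counts."""
    return cost + sum(weights.get(c, 0.0) * n for c, n in features(ops).items())

def search(start_ops, cost_fn, rules, weights, alpha=1.05, max_states=1000):
    """Best-first search over substitution sequences.

    rules: list of (required_category, apply) pairs, where apply(ops)
    returns a rewritten operator list or None if the pattern is absent.
    States scoring worse than alpha * best are pruned, and rules whose
    required category does not appear in the feature counts are skipped
    without attempting a (more expensive) subgraph match.
    """
    tie = count()  # tiebreaker so the heap never compares lists
    best_ops = start_ops
    best = score(cost_fn(start_ops), start_ops, weights)
    heap = [(best, next(tie), start_ops)]
    seen = {tuple(start_ops)}
    while heap and len(seen) < max_states:
        s, _, ops = heapq.heappop(heap)
        if s > alpha * best:          # prune states far from the best score
            continue
        feats = features(ops)
        for need, apply in rules:
            if feats[need] == 0:      # cheap mismatch check via features
                continue
            new_ops = apply(ops)
            if new_ops is None or tuple(new_ops) in seen:
                continue
            seen.add(tuple(new_ops))
            ns = score(cost_fn(new_ops), new_ops, weights)
            if ns < best:
                best, best_ops = ns, new_ops
            heapq.heappush(heap, (ns, next(tie), new_ops))
    return best_ops

# Toy example: a substitution fusing an adjacent conv + relu pair.
OP_COST = {"conv": 2.0, "relu": 1.0, "conv_relu": 2.5}

def fuse_conv_relu(ops):
    for i in range(len(ops) - 1):
        if ops[i] == "conv" and ops[i + 1] == "relu":
            return ops[:i] + ["conv_relu"] + ops[i + 2:]
    return None

def toy_cost(ops):
    return sum(OP_COST[op] for op in ops)
```

For instance, `search(["conv", "relu", "conv", "relu"], toy_cost, [("compute", fuse_conv_relu)], {})` applies the fusion rule twice, reducing the toy cost from 6.0 to 5.0.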