A MIC-based acceleration model of Deep Learning

Jian-long Zhang, Xumin Zheng, W. Shen, D. Zhou, Feng Qiu, Huiran Zhang
{"title":"A MIC-based acceleration model of Deep Learning","authors":"Jian-long Zhang, Xumin Zheng, W. Shen, D. Zhou, Feng Qiu, Huiran Zhang","doi":"10.1109/ICALIP.2016.7846603","DOIUrl":null,"url":null,"abstract":"In the era of Computational Intelligence developing rapidly, Deep Learning (DL) has gradually won acceptance from the world of Artificial Intelligence (AI) and it has been widely applied to the industry. However, the training of the network requires a considerable amount of time. For instances, the training of Convolution Neural Network (CNN) and Deep Belief Network (DBN) may take one week or even longer. Therefore, a new challenge has been put forward to the world of Artificial Intelligence, which demands decrease on the training time of Deep Learning algorithm effectively. And in this paper, we proposed a Deep Learning acceleration model based on MIC, which can reduce the training time significantly by using Restricted Boltzmann Machine (RBM) and Logistic Regression (LR). First, it conducts vectorization on the program, and then accelerates it by using the model we proposed in this paper. And the paper mainly consists of the design of the parallel model, which comprising data parallelism, model parallelism, a hybrid of data and model parallelism and so on. And experiments showed that the MIC-based acceleration model can reduce the training time to 1/10 of the original.","PeriodicalId":184170,"journal":{"name":"2016 International Conference on Audio, Language and Image Processing (ICALIP)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Conference on Audio, Language and Image Processing (ICALIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICALIP.2016.7846603","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

In the era of rapidly developing Computational Intelligence, Deep Learning (DL) has gradually won acceptance in the field of Artificial Intelligence (AI) and has been widely applied in industry. However, training such networks requires a considerable amount of time; for instance, training a Convolutional Neural Network (CNN) or a Deep Belief Network (DBN) may take a week or longer. This poses a new challenge to the field of Artificial Intelligence: effectively reducing the training time of Deep Learning algorithms. In this paper, we propose a Deep Learning acceleration model based on MIC (Intel Many Integrated Core), which significantly reduces the training time of the Restricted Boltzmann Machine (RBM) and Logistic Regression (LR). The program is first vectorized and then accelerated with the proposed model. The paper centers on the design of the parallel model, covering data parallelism, model parallelism, and a hybrid of the two. Experiments show that the MIC-based acceleration model reduces training time to about 1/10 of the original.
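The abstract does not include source code, but as a rough illustration of the kind of vectorized kernel such an acceleration model targets, the sketch below computes the RBM hidden-unit activation probabilities p(h_j = 1 | v) = sigmoid(b_j + sum_i v_i * W_ji) with OpenMP thread parallelism and SIMD hints. All names, layer sizes, and the use of plain OpenMP instead of Intel's MIC offload pragmas are illustrative assumptions, not the authors' implementation.

```c
/* Minimal sketch (not the authors' code): vectorized positive phase of RBM
   training, parallelized with OpenMP as a stand-in for MIC-oriented
   vectorization. Sizes and data are illustrative placeholders.
   Build: cc -O2 -fopenmp rbm_sketch.c -lm */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* p(h_j = 1 | v) = sigmoid(b_j + sum_i v_i * W[j][i]).
   W is stored row-major as n_hidden x n_visible so the inner loop is
   unit-stride and vectorizes well. */
static void rbm_hidden_probs(const float *v, const float *W, const float *b,
                             float *h, int n_visible, int n_hidden)
{
    #pragma omp parallel for                   /* one hidden unit per thread chunk */
    for (int j = 0; j < n_hidden; ++j) {
        float acc = b[j];
        #pragma omp simd reduction(+:acc)      /* vectorized dot product */
        for (int i = 0; i < n_visible; ++i)
            acc += v[i] * W[(size_t)j * n_visible + i];
        h[j] = 1.0f / (1.0f + expf(-acc));     /* logistic activation */
    }
}

int main(void)
{
    enum { NV = 784, NH = 500 };               /* e.g. an MNIST-sized layer */
    float *v = calloc(NV, sizeof *v);
    float *W = calloc((size_t)NH * NV, sizeof *W);
    float *b = calloc(NH, sizeof *b);
    float *h = calloc(NH, sizeof *h);
    if (!v || !W || !b || !h) return 1;

    rbm_hidden_probs(v, W, b, h, NV, NH);
    printf("h[0] = %f\n", h[0]);               /* 0.5 with zero weights/bias */

    free(v); free(W); free(b); free(h);
    return 0;
}
```

On an actual Xeon Phi coprocessor, a loop nest like this would typically either be wrapped in an Intel `#pragma offload target(mic)` region or compiled natively for the card; the data-parallel and hybrid schemes mentioned in the abstract would additionally split the mini-batch across cores or devices.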