Research on Multi-task Deep Learning Approach Based on Hybrid Sharing and Network Optimization

Hui Guo, Jingchun Guo
{"title":"基于混合共享和网络优化的多任务深度学习方法研究","authors":"Hui Guo, Jingchun Guo","doi":"10.1109/CCIS53392.2021.9754530","DOIUrl":null,"url":null,"abstract":"Multi-task learning which is a branch of deep learning has received extensive attention and in-depth research. However, there are still some difficult problems such as unclear feature sharing, indistinguishable related tasks and overly complex network structure. Therefore, a multi-task learning approach based on hybrid sharing and network optimization is presented. Firstly, training data is fed into the hard parameter sharing network for hybrid training without distinguishing tasks, then the similarity of tasks is measured according to the gradient changes of each task in sharing network layers. Secondly, similar tasks are divided into the same group which is represented by a hard parameter sharing network, while tasks with weak correlation or large differences are divided into different groups which are characterized by soft parameter sharing network. Moreover, it gives a new network training method combining hybrid and alternating, so as to take full advantages of approaches based on the task-level and feature-level. Thirdly, according to the differences of features extracted from the shared layers and the gradient changes in deep layers, the relevant activation value is adjusted and the network is optimized, which not only maintain the conciseness of the network structure, but also help to solve the non-equilibrium problem of data during multi-task learning. 
Finally, the feasibility and effectiveness of this approach is verified through the applications of MNIST data set and iris and balance data in the UCI data set.","PeriodicalId":191226,"journal":{"name":"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Research on Multi-task Deep Learning Approach Based on Hybrid Sharing and Network Optimization\",\"authors\":\"Hui Guo, Jingchun Guo\",\"doi\":\"10.1109/CCIS53392.2021.9754530\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multi-task learning which is a branch of deep learning has received extensive attention and in-depth research. However, there are still some difficult problems such as unclear feature sharing, indistinguishable related tasks and overly complex network structure. Therefore, a multi-task learning approach based on hybrid sharing and network optimization is presented. Firstly, training data is fed into the hard parameter sharing network for hybrid training without distinguishing tasks, then the similarity of tasks is measured according to the gradient changes of each task in sharing network layers. Secondly, similar tasks are divided into the same group which is represented by a hard parameter sharing network, while tasks with weak correlation or large differences are divided into different groups which are characterized by soft parameter sharing network. Moreover, it gives a new network training method combining hybrid and alternating, so as to take full advantages of approaches based on the task-level and feature-level. 
Thirdly, according to the differences of features extracted from the shared layers and the gradient changes in deep layers, the relevant activation value is adjusted and the network is optimized, which not only maintain the conciseness of the network structure, but also help to solve the non-equilibrium problem of data during multi-task learning. Finally, the feasibility and effectiveness of this approach is verified through the applications of MNIST data set and iris and balance data in the UCI data set.\",\"PeriodicalId\":191226,\"journal\":{\"name\":\"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)\",\"volume\":\"49 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-11-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCIS53392.2021.9754530\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCIS53392.2021.9754530","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citation count: 0

Abstract

Multi-task learning, a branch of deep learning, has received extensive attention and in-depth research. However, difficult problems remain, such as unclear feature sharing, hard-to-distinguish related tasks, and overly complex network structures. This paper therefore presents a multi-task learning approach based on hybrid sharing and network optimization. First, training data are fed into a hard-parameter-sharing network for hybrid training without distinguishing tasks, and task similarity is then measured from the gradient changes of each task in the shared network layers. Second, similar tasks are placed in the same group, represented by a hard-parameter-sharing network, while weakly correlated or markedly different tasks are placed in different groups, characterized by soft-parameter-sharing networks. A new training method combining hybrid and alternating training is also given, so as to take full advantage of both task-level and feature-level approaches. Third, according to the differences among the features extracted from the shared layers and the gradient changes in the deep layers, the relevant activation values are adjusted and the network is optimized, which not only keeps the network structure concise but also helps address data imbalance during multi-task learning. Finally, the feasibility and effectiveness of the approach are verified on the MNIST data set and on the iris and balance data sets from the UCI repository.
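The abstract measures task similarity from each task's gradient changes in the shared layers and groups similar tasks into a common hard-sharing network. A minimal sketch of that idea, assuming cosine similarity over flattened shared-layer gradients and a greedy grouping threshold (the exact metric and grouping rule are not specified in the abstract; the function names are hypothetical):

```python
import numpy as np

def task_similarity(grads_a, grads_b):
    """Cosine similarity between two tasks' flattened gradients with
    respect to the shared parameters (assumed metric)."""
    a = np.asarray(grads_a, dtype=float)
    b = np.asarray(grads_b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def group_tasks(task_grads, threshold=0.5):
    """Greedy grouping: a task joins the first existing group whose
    representative (first member) has gradient similarity above the
    threshold; otherwise it starts a new group. Groups correspond to
    hard-parameter-sharing networks; separate groups would be linked
    by soft parameter sharing."""
    groups = []
    for name, grad in task_grads.items():
        for group in groups:
            if task_similarity(grad, task_grads[group[0]]) >= threshold:
                group.append(name)
                break
        else:
            groups.append([name])
    return groups
```

For example, two tasks whose shared-layer gradients point in nearly the same direction would fall into one hard-sharing group, while a task with an orthogonal gradient would start its own group.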
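The "combining hybrid and alternating" training the abstract mentions can be pictured as a schedule that first trains the shared network on mixed batches drawn from all tasks, then cycles through tasks one at a time. This is an illustrative schedule only; the paper's exact interleaving is not given in the abstract:

```python
import itertools

def hybrid_alternating_schedule(tasks, hybrid_steps, alternating_steps):
    """Yield (phase, tasks) pairs: a hybrid phase in which every step
    trains on a mixed batch from all tasks without distinguishing them,
    followed by an alternating phase that visits one task per step in
    round-robin order."""
    for _ in range(hybrid_steps):
        yield ("hybrid", tuple(tasks))           # mixed batch, all tasks
    cycle = itertools.cycle(tasks)
    for _ in range(alternating_steps):
        yield ("alternating", (next(cycle),))    # one task per step
```

A training loop would consume this schedule, computing a joint loss in the hybrid phase and a single-task loss in the alternating phase.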