Towards Context-aware Distributed Learning for CNN in Mobile Applications

Zhuwei Qin, Hao Jiang
2020 IEEE/ACM Symposium on Edge Computing (SEC)
DOI: 10.1109/SEC50012.2020.00045
Published: 2020-11-01
Citations: 0

Abstract

Intelligent mobile applications have become ubiquitous on mobile devices. These applications continually collect new and sensitive data from different users, and are expected to continually adapt their embedded machine learning models to this newly collected data. To improve quality of service while protecting users’ privacy, distributed mobile learning (e.g., Federated Learning (FedAvg) [1]) has been proposed to offload model training from the cloud to the mobile devices, enabling multiple devices to collaboratively train a shared model without leaking data to the cloud. However, this design becomes impracticable when training a machine learning model (e.g., a Convolutional Neural Network (CNN)) on mobile devices with diverse application contexts. In conventional distributed training schemes, different devices are assumed to hold complete training datasets and to train identical CNN model structures; distributed collaboration between devices is then implemented as a straightforward weighted average of the identical local models. In mobile image classification tasks, however, different mobile applications have dedicated classification targets that depend on individual users’ preferences and application specifics. Directly averaging the weights of each local model therefore results in a significant reduction in test accuracy. To solve this problem, we propose CAD: a context-aware distributed learning framework for mobile applications, in which each mobile device is deployed with a context-adaptive submodel structure instead of the entire global model structure.
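The conventional aggregation step the abstract critiques can be made concrete with a short sketch. This is a minimal, hypothetical illustration of FedAvg-style weighted averaging over identically structured local models (represented here as lists of NumPy arrays, weighted by each device’s sample count); it is not the CAD framework itself, and the function and variable names are our own.

```python
import numpy as np

def fedavg(local_weights, num_samples):
    """Sample-weighted average of identically structured local models.

    local_weights: one list of np.ndarray per device; all devices must
                   share the same layer shapes (the assumption CAD drops).
    num_samples:   training-set size of each device, used as its weight.
    """
    total = sum(num_samples)
    coeffs = [n / total for n in num_samples]
    # Average layer by layer across devices.
    return [
        sum(c * dev[layer] for c, dev in zip(coeffs, local_weights))
        for layer in range(len(local_weights[0]))
    ]

# Two devices, each holding a tiny two-layer "model".
dev_a = [np.ones((2, 2)), np.zeros(2)]
dev_b = [np.zeros((2, 2)), np.ones(2)]
avg = fedavg([dev_a, dev_b], num_samples=[1, 1])
```

With equal sample counts, every entry of the averaged model is simply the mean of the two devices’ weights; when the devices’ classification targets differ, this blind averaging is exactly what degrades test accuracy.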