MAFD: A Federated Distillation Approach with Multi-head Attention for Recommendation Tasks

IF 0.4 Q4 COMPUTER SCIENCE, INFORMATION SYSTEMS
Aming Wu, Young-Woo Kwon
{"title":"基于多头关注的推荐任务联邦蒸馏方法","authors":"Aming Wu, Young-Woo Kwon","doi":"10.1145/3555776.3577849","DOIUrl":null,"url":null,"abstract":"The key challenges that recommendation systems must overcome are data isolation and privacy protection issues. Federated learning can efficiently train global models using decentralized data while preserving privacy. In real-world applications, however, it is difficult to achieve high prediction accuracy due to the heterogeneity of devices, the lack of data, and the limited generalization capacity of models. In this research, we introduce a personalized federated knowledge distillation model for a recommendation system based on a multi-head attention mechanism for recommendation systems. Specifically, we first employ federated distillation to improve the performance of student models and introduce a multi-head attention mechanism to enhance user encoding information. Next, we incorporate Wasserstein distance into the objective function of combined distillation to reduce the distribution gap between teacher and student networks and also use an adaptive learning rate technique to enhance convergence. We show that the proposed approach achieves better effectiveness and robustness through benchmarks.","PeriodicalId":42971,"journal":{"name":"Applied Computing Review","volume":null,"pages":null},"PeriodicalIF":0.4000,"publicationDate":"2023-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MAFD: A Federated Distillation Approach with Multi-head Attention for Recommendation Tasks\",\"authors\":\"Aming Wu, Young-Woo Kwon\",\"doi\":\"10.1145/3555776.3577849\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The key challenges that recommendation systems must overcome are data isolation and privacy protection issues. Federated learning can efficiently train global models using decentralized data while preserving privacy. In real-world applications, however, it is difficult to achieve high prediction accuracy due to the heterogeneity of devices, the lack of data, and the limited generalization capacity of models. In this research, we introduce a personalized federated knowledge distillation model for a recommendation system based on a multi-head attention mechanism for recommendation systems. Specifically, we first employ federated distillation to improve the performance of student models and introduce a multi-head attention mechanism to enhance user encoding information. Next, we incorporate Wasserstein distance into the objective function of combined distillation to reduce the distribution gap between teacher and student networks and also use an adaptive learning rate technique to enhance convergence. 
We show that the proposed approach achieves better effectiveness and robustness through benchmarks.\",\"PeriodicalId\":42971,\"journal\":{\"name\":\"Applied Computing Review\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.4000,\"publicationDate\":\"2023-03-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Computing Review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3555776.3577849\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Computing Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3555776.3577849","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

The key challenges that recommendation systems must overcome are data isolation and privacy protection. Federated learning can efficiently train global models on decentralized data while preserving privacy. In real-world applications, however, high prediction accuracy is difficult to achieve because of device heterogeneity, data scarcity, and the limited generalization capacity of models. In this research, we introduce a personalized federated knowledge distillation model for recommendation systems based on a multi-head attention mechanism. Specifically, we first employ federated distillation to improve the performance of student models and introduce a multi-head attention mechanism to enrich the user encoding. Next, we incorporate the Wasserstein distance into the combined distillation objective to reduce the distribution gap between teacher and student networks, and we use an adaptive learning rate technique to improve convergence. Benchmark experiments show that the proposed approach achieves better effectiveness and robustness.
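The abstract gives no implementation details, so the following is only a minimal sketch of the multi-head attention step used to enhance the user encoding, assuming a PyTorch implementation; the class name UserAttentionEncoder, the embedding size, and the head count are illustrative and not taken from the paper. It aggregates a user's interacted-item embeddings with self-attention into a single user vector.

```python
# Minimal sketch (not the authors' code): encode a user from the embeddings of
# items they interacted with, using multi-head self-attention (PyTorch assumed).
import torch
import torch.nn as nn


class UserAttentionEncoder(nn.Module):
    def __init__(self, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        # Multi-head self-attention over the sequence of item embeddings.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, item_embeddings: torch.Tensor) -> torch.Tensor:
        # item_embeddings: (batch, num_items, embed_dim)
        attended, _ = self.attn(item_embeddings, item_embeddings, item_embeddings)
        attended = self.norm(attended + item_embeddings)   # residual connection
        return attended.mean(dim=1)                        # (batch, embed_dim) user vector


if __name__ == "__main__":
    encoder = UserAttentionEncoder(embed_dim=64, num_heads=4)
    history = torch.randn(8, 10, 64)     # 8 users, 10 interacted items each
    user_vec = encoder(history)
    print(user_vec.shape)                # torch.Size([8, 64])
```

The residual connection, layer normalization, and mean pooling are common defaults; the paper may combine the attended sequence differently.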
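The abstract also states that the Wasserstein distance is added to the combined distillation objective to reduce the distribution gap between the teacher and student networks. The paper's exact formulation is not reproduced on this page; the sketch below is one plausible reading that combines a task loss, a temperature-scaled KL distillation term, and the closed-form 1-D Wasserstein-1 distance between the empirical distributions of teacher and student scores (valid for equal-size, uniformly weighted samples). The weights alpha and beta and the temperature are illustrative assumptions.

```python
# Hedged sketch (not the authors' code) of a combined distillation loss with a
# Wasserstein term. For two equal-size 1-D empirical distributions, the
# Wasserstein-1 distance is the mean absolute difference of the sorted samples.
import torch
import torch.nn.functional as F


def wasserstein_1d(student_scores: torch.Tensor, teacher_scores: torch.Tensor) -> torch.Tensor:
    s_sorted, _ = torch.sort(student_scores.flatten())
    t_sorted, _ = torch.sort(teacher_scores.flatten())
    return (s_sorted - t_sorted).abs().mean()


def distillation_loss(student_logits, teacher_logits, labels,
                      alpha: float = 0.5, beta: float = 0.1, temperature: float = 2.0):
    # student_logits, teacher_logits: (batch, num_candidates) scores over candidate items
    # labels: (batch,) index of the positively interacted item
    task = F.cross_entropy(student_logits, labels)          # supervised task loss
    kd = F.kl_div(                                          # soft-label distillation
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    w = wasserstein_1d(F.softmax(student_logits, dim=-1),   # distribution alignment
                       F.softmax(teacher_logits, dim=-1))
    return task + alpha * kd + beta * w


if __name__ == "__main__":
    student = torch.randn(32, 100, requires_grad=True)
    teacher = torch.randn(32, 100)
    labels = torch.randint(0, 100, (32,))
    print(distillation_loss(student, teacher, labels))
```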
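Finally, the abstract mentions federated distillation of student models and an adaptive learning rate technique to improve convergence. As a rough, framework-level illustration rather than the authors' protocol, the sketch below shows a FedMD-style round: each client publishes soft predictions on a shared public batch, the server averages them into a consensus teacher signal, and each client then distills from that consensus while torch.optim.lr_scheduler.ReduceLROnPlateau shrinks the learning rate when the local loss stops improving.

```python
# Rough sketch (illustrative, not the authors' protocol) of one FedMD-style
# federated distillation round with an adaptive learning rate.
import torch
import torch.nn.functional as F


def federated_distillation_round(clients, public_batch, temperature=2.0, local_steps=5):
    # 1) Every client predicts soft scores on the shared public batch.
    with torch.no_grad():
        soft_preds = [client["model"](public_batch) for client in clients]
    # 2) The server averages the predictions into a consensus "teacher" signal.
    consensus = torch.stack(soft_preds).mean(dim=0)
    # 3) Each client distills locally from the consensus; ReduceLROnPlateau
    #    lowers the learning rate when the local distillation loss plateaus.
    for client in clients:
        model, opt, sched = client["model"], client["optimizer"], client["scheduler"]
        for _ in range(local_steps):
            opt.zero_grad()
            loss = F.kl_div(
                F.log_softmax(model(public_batch) / temperature, dim=-1),
                F.softmax(consensus / temperature, dim=-1),
                reduction="batchmean",
            )
            loss.backward()
            opt.step()
        sched.step(loss)


if __name__ == "__main__":
    model = torch.nn.Linear(64, 100)   # stand-in recommender scoring 100 candidate items
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    sched = torch.optim.lr_scheduler.ReduceLROnPlateau(opt, factor=0.5, patience=2)
    clients = [{"model": model, "optimizer": opt, "scheduler": sched}]
    federated_distillation_round(clients, torch.randn(128, 64))
```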
Source journal
Applied Computing Review (COMPUTER SCIENCE, INFORMATION SYSTEMS)
Self-citation rate: 40.00%
Articles published: 8