An autoencoder-based confederated clustering leveraging a robust model fusion strategy for federated unsupervised learning

IF 14.7 · CAS Tier 1, Computer Science · JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Nahid Hasan, Md. Golam Rabiul Alam, Shamim H. Ripon, Phuoc Hung Pham, Mohammad Mehedi Hassan
DOI: 10.1016/j.inffus.2024.102751
Journal: Information Fusion, Volume 115, Article 102751
Published: 2024-11-06 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S1566253524005293
Citations: 0

Abstract

Concerns about data privacy, security, and ethics grow more prominent as data volumes continue to increase. Unlike centralized setups, where all data is accessible at a single location, federated settings keep data distributed; model-based clustering approaches can nevertheless be employed successfully there. However, clustering in federated settings remains relatively unexplored and requires further attention. Because federated clustering operates on remote data while privacy and security must be maintained, it poses particular challenges as well as opportunities. While model-based clustering is promising in federated environments, clustering requires a robust model aggregation method rather than a generic one such as Federated Averaging (FedAvg). In this research, we propose an autoencoder-based clustering method built on a novel model aggregation method, FednadamN, which fuses the Adam and Nadam optimization approaches in a federated learning setting. FednadamN adopts Adam's adaptive learning rates, based on the first and second moments of the gradients, which offer fast convergence and robustness to noisy data; it also incorporates Nadam's Nesterov-accelerated gradients to further improve convergence speed and stability. We studied the performance of the proposed autoencoder-based clustering methods on benchmark datasets using the novel FednadamN model aggregation strategy, observing remarkable performance gains in federated clustering compared with the state of the art.
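The abstract does not give FednadamN's update equations, but it describes the ingredients: Adam's bias-corrected first and second moments of the gradient, plus Nadam's Nesterov lookahead, applied at the aggregation step. A minimal illustrative sketch, under the assumption that the server treats the gap between the global model and the weighted FedAvg average as a pseudo-gradient (as in adaptive federated optimizers), might look like the following; the function name, state layout, and hyperparameters are hypothetical, not the authors' implementation:

```python
import numpy as np

def fednadamn_server_step(global_w, client_ws, client_sizes, state,
                          lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8):
    """One hypothetical FednadamN aggregation round.

    Treats the weighted FedAvg delta as a pseudo-gradient and applies a
    Nadam-style update: Adam's bias-corrected first/second moments plus
    a Nesterov lookahead on the momentum term.
    """
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    # Weighted average of client models (plain FedAvg), then pseudo-gradient.
    avg_w = sum(cw * w for cw, w in zip(weights, client_ws))
    g = global_w - avg_w                      # step toward the average
    state["t"] += 1
    t = state["t"]
    # Adam moments: exponential averages of gradient and squared gradient.
    state["m"] = beta1 * state["m"] + (1 - beta1) * g
    state["v"] = beta2 * state["v"] + (1 - beta2) * g * g
    m_hat = state["m"] / (1 - beta1 ** t)     # bias correction
    v_hat = state["v"] / (1 - beta2 ** t)
    # Nadam's Nesterov lookahead: blend the current gradient into the
    # bias-corrected momentum before taking the step.
    m_nes = beta1 * m_hat + (1 - beta1) * g / (1 - beta1 ** t)
    return global_w - lr * m_nes / (np.sqrt(v_hat) + eps)

# Example round: two clients of equal size pull the global model
# toward their average; state persists across rounds.
global_w = np.array([1.0, -2.0])
client_ws = [np.array([0.0, 0.0]), np.array([0.2, -0.2])]
state = {"m": np.zeros_like(global_w), "v": np.zeros_like(global_w), "t": 0}
new_w = fednadamn_server_step(global_w, client_ws, [10, 10], state)
```

The design choice mirrored here is the one the abstract motivates: replacing FedAvg's raw averaging with an adaptive, momentum-based server update so that noisy or heterogeneous client updates are damped rather than applied directly.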
Source journal: Information Fusion (Engineering & Technology – Computer Science: Theory & Methods)
CiteScore: 33.20
Self-citation rate: 4.30%
Annual articles: 161
Review time: 7.9 months
Aims & scope: Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses, as well as those demonstrating application to real-world problems, are welcome.