{"title":"Swarm mutual learning","authors":"Kang Haiyan, Wang Jiakang","doi":"10.1007/s40747-024-01573-2","DOIUrl":null,"url":null,"abstract":"<p>With the rapid growth of big data, extracting meaningful knowledge from data is crucial for machine learning. The existing Swarm Learning data collaboration models face challenges such as data security, model security, high communication overhead, and model performance optimization. To address this, we propose the Swarm Mutual Learning (SML). Firstly, we introduce an Adaptive Mutual Distillation Algorithm that dynamically controls the learning intensity based on distillation weights and strength, enhancing the efficiency of knowledge extraction and transfer during mutual distillation. Secondly, we design a Global Parameter Aggregation Algorithm based on homomorphic encryption, coupled with a Dynamic Gradient Decomposition Algorithm using singular value decomposition. This allows the model to aggregate parameters in ciphertext, significantly reducing communication overhead during uploads and downloads. Finally, we validate the proposed methods on real datasets, demonstrating their effectiveness and efficiency in model updates. On the MNIST dataset and CIFAR-10 dataset, the local model accuracies reached 95.02% and 55.26%, respectively, surpassing those of the comparative models. 
Furthermore, while ensuring the security of the aggregation process, we significantly reduced the communication overhead for uploading and downloading.</p>","PeriodicalId":10524,"journal":{"name":"Complex & Intelligent Systems","volume":"79 1","pages":""},"PeriodicalIF":5.0000,"publicationDate":"2024-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Complex & Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s40747-024-01573-2","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
With the rapid growth of big data, extracting meaningful knowledge from data is crucial for machine learning. Existing Swarm Learning data collaboration models face challenges such as data security, model security, high communication overhead, and model performance optimization. To address these challenges, we propose Swarm Mutual Learning (SML). Firstly, we introduce an Adaptive Mutual Distillation Algorithm that dynamically controls the learning intensity based on distillation weights and strength, enhancing the efficiency of knowledge extraction and transfer during mutual distillation. Secondly, we design a Global Parameter Aggregation Algorithm based on homomorphic encryption, coupled with a Dynamic Gradient Decomposition Algorithm using singular value decomposition. This allows the model to aggregate parameters in ciphertext, significantly reducing communication overhead during uploads and downloads. Finally, we validate the proposed methods on real datasets, demonstrating their effectiveness and efficiency in model updates. On the MNIST and CIFAR-10 datasets, the local model accuracies reached 95.02% and 55.26%, respectively, surpassing those of the comparative models. Furthermore, while ensuring the security of the aggregation process, we significantly reduced the communication overhead for uploading and downloading.
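The abstract's communication-reduction idea rests on truncated singular value decomposition: a gradient matrix that is approximately low-rank can be transmitted as its leading singular factors instead of its full entries. The sketch below is a minimal illustration of that general principle using NumPy, not the paper's Dynamic Gradient Decomposition Algorithm; the function names, the rank choice, and the synthetic gradient are all assumptions for demonstration.

```python
import numpy as np

def svd_compress(grad, rank):
    """Truncated SVD of a gradient matrix. Sending the factors costs
    rank * (m + n + 1) floats instead of the full m * n entries."""
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def svd_decompress(U, s, Vt):
    """Rebuild the low-rank approximation from the transmitted factors."""
    return (U * s) @ Vt

rng = np.random.default_rng(0)
# A synthetic 256 x 128 "gradient" with exact rank-8 structure.
grad = rng.normal(size=(256, 8)) @ rng.normal(size=(8, 128))

U, s, Vt = svd_compress(grad, rank=8)
approx = svd_decompress(U, s, Vt)

original_floats = grad.size                    # 256 * 128 = 32768
compressed_floats = U.size + s.size + Vt.size  # 8 * (256 + 128 + 1) = 3080
rel_err = np.linalg.norm(grad - approx) / np.linalg.norm(grad)
```

Here the factors are roughly a tenth the size of the raw matrix, and because the synthetic gradient is exactly rank 8 the reconstruction error is at the level of floating-point noise; on real gradients the truncation rank trades bandwidth against approximation error.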
Journal introduction:
Complex & Intelligent Systems aims to provide a forum for presenting and discussing novel approaches, tools and techniques meant for attaining a cross-fertilization between the broad fields of complex systems, computational simulation, and intelligent analytics and visualization. The transdisciplinary research that the journal focuses on will expand the boundaries of our understanding by investigating the principles and processes that underlie many of the most profound problems facing society today.