A survey of optimization algorithms for differential privacy in Federated Learning

Fangfang Shan, Yuhang Liu, Lulu Fan, Yifan Mao, Zhuo Chen, Shuaifeng Li

Journal of Systems Architecture, Volume 168, Article 103582. Published 2025-09-23. DOI: 10.1016/j.sysarc.2025.103582
Abstract
Federated Learning (FL) is a distributed machine learning approach that enables joint model training without sharing raw data; however, the information transmitted during model updates still carries risks that may lead to the leakage of user privacy. In recent years, differential privacy (DP) techniques have been widely applied to federated learning to enhance data privacy protection. However, introducing differential privacy often degrades model performance, for example by reducing model accuracy and increasing training time. Effectively balancing privacy protection and model performance in federated learning has therefore become an urgent problem. This paper first introduces the basic principles of federated learning and differential privacy, and then reviews optimization algorithms for Differential Privacy in Federated Learning (DPFL). Unlike existing reviews on DPFL, we categorize the optimization algorithms into three types: noise mechanism optimization, privacy budget management, and model update optimization. Drawing on a large body of related studies, we elaborate on the basic ideas, key innovations, and other aspects of the various optimization methods, showing their performance and advantages in balancing privacy protection and model performance. Finally, we provide an outlook on future research directions, including further integrating DPFL with other advanced technologies to better support applications in complex scenarios, enhancing visualization, and exploring the application of DPFL optimization algorithms in more practical fields.
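To make the abstract's core trade-off concrete, the sketch below shows the common DP-FL pattern the survey's first category (noise mechanism optimization) builds on: each client clips its model update to bound sensitivity and adds Gaussian noise before sending it to the server. This is a minimal illustrative sketch, not the paper's method; `clip_norm` and `noise_multiplier` are assumed hyperparameter names and values chosen for illustration.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client's model update in L2 norm and add Gaussian noise.

    Illustrative Gaussian-mechanism sketch: clipping bounds each client's
    contribution (sensitivity) to `clip_norm`; the noise scale grows with
    `noise_multiplier`, trading accuracy for privacy.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    norm = np.linalg.norm(update)
    # Scale down so the update's L2 norm is at most clip_norm.
    clipped = update / max(1.0, norm / clip_norm)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

# Server side: average the privatized updates from several clients.
client_updates = [np.ones(4) * k for k in range(1, 4)]
aggregated = np.mean([privatize_update(u) for u in client_updates], axis=0)
```

Larger `noise_multiplier` gives stronger privacy but noisier aggregates, which is exactly the accuracy/privacy tension the surveyed optimization algorithms try to ease.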
Journal description:
The Journal of Systems Architecture: Embedded Software Design (JSA) is a journal covering all design and architectural aspects related to embedded systems and software. It ranges from the microarchitecture level via the system software level up to the application-specific architecture level. Aspects such as real-time systems, operating systems, FPGA programming, programming languages, communications (limited to analysis and the software stack), mobile systems, parallel and distributed architectures as well as additional subjects in the computer and system architecture area will fall within the scope of this journal. Technology will not be a main focus, but its use and relevance to particular designs will be. Case studies are welcome but must contribute more than just a design for a particular piece of software.
Design automation of such systems, including methodologies, techniques, and tools for their design, as well as novel designs of software components, falls within the scope of this journal. Novel applications that use embedded systems are also central to this journal. While hardware itself is not part of this journal, hardware/software co-design methods that consider the interplay between software and hardware components, with an emphasis on software, are also relevant here.