{"title":"在异构数据分布的联邦学习中使用局部学习归一化减少权值发散影响","authors":"Flávio Vieira, Carlos Alberto V. Campos","doi":"10.1016/j.future.2025.107881","DOIUrl":null,"url":null,"abstract":"<div><div>In an increasingly connected world, technologies such as smartphones, 5G, drones, the Internet of Things, and Smart Cities bring new challenges and opportunities. The increase in data collected by these devices and their ease of access allows the use of machine learning techniques to provide intelligent and quality services. Considering these services’ distributed access to data, using Federated Learning is an excellent option for decentralized machine learning processing with greater security. However, client access to heterogeneously distributed data impacts federated learning, reducing test accuracy and increasing communication costs. To address these problems, we present a new method called Federated Learning with Weight Standardization on Convolutional Neural Networks (FedWS) that uses normalization on weights in local training on convolutional neural networks, optimizing the training gradients and reducing the impact of weight divergence caused by heterogeneous distributions on federated training tasks. Results showed that our method got superior results on image classification tasks in the order of 1.36% to 5.0% of test accuracy at different levels of heterogeneity. 
Their behavior showed a reduction in the effects of divergence at higher levels of heterogeneity and additional communication cost reduction ranging from 20% to 100% due to fast convergence and being more suitable for use in mobile devices with computational resource limitations.</div></div>","PeriodicalId":55132,"journal":{"name":"Future Generation Computer Systems-The International Journal of Escience","volume":"173 ","pages":"Article 107881"},"PeriodicalIF":6.2000,"publicationDate":"2025-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Reducing weight divergence impact using local learning normalization in Federated Learning for heterogeneous data distributions\",\"authors\":\"Flávio Vieira, Carlos Alberto V. Campos\",\"doi\":\"10.1016/j.future.2025.107881\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>In an increasingly connected world, technologies such as smartphones, 5G, drones, the Internet of Things, and Smart Cities bring new challenges and opportunities. The increase in data collected by these devices and their ease of access allows the use of machine learning techniques to provide intelligent and quality services. Considering these services’ distributed access to data, using Federated Learning is an excellent option for decentralized machine learning processing with greater security. However, client access to heterogeneously distributed data impacts federated learning, reducing test accuracy and increasing communication costs. To address these problems, we present a new method called Federated Learning with Weight Standardization on Convolutional Neural Networks (FedWS) that uses normalization on weights in local training on convolutional neural networks, optimizing the training gradients and reducing the impact of weight divergence caused by heterogeneous distributions on federated training tasks. 
Results showed that our method got superior results on image classification tasks in the order of 1.36% to 5.0% of test accuracy at different levels of heterogeneity. Their behavior showed a reduction in the effects of divergence at higher levels of heterogeneity and additional communication cost reduction ranging from 20% to 100% due to fast convergence and being more suitable for use in mobile devices with computational resource limitations.</div></div>\",\"PeriodicalId\":55132,\"journal\":{\"name\":\"Future Generation Computer Systems-The International Journal of Escience\",\"volume\":\"173 \",\"pages\":\"Article 107881\"},\"PeriodicalIF\":6.2000,\"publicationDate\":\"2025-05-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Future Generation Computer Systems-The International Journal of Escience\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0167739X25001761\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future Generation Computer Systems-The International Journal of Escience","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167739X25001761","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Reducing weight divergence impact using local learning normalization in Federated Learning for heterogeneous data distributions
In an increasingly connected world, technologies such as smartphones, 5G, drones, the Internet of Things, and Smart Cities bring new challenges and opportunities. The growing volume of data collected by these devices, and its ease of access, enables machine learning techniques to provide intelligent, high-quality services. Given these services' distributed access to data, Federated Learning is an excellent option for decentralized machine learning with greater security. However, client access to heterogeneously distributed data degrades federated learning, reducing test accuracy and increasing communication costs. To address these problems, we present a new method, Federated Learning with Weight Standardization on Convolutional Neural Networks (FedWS), which normalizes the weights during local training of convolutional neural networks, smoothing the training gradients and reducing the impact of the weight divergence that heterogeneous distributions cause in federated training tasks. Results showed that our method improved test accuracy on image classification tasks by 1.36% to 5.0% across different levels of heterogeneity. FedWS also reduced the effects of divergence at higher levels of heterogeneity and, thanks to its fast convergence, cut communication costs by an additional 20% to 100%, making it more suitable for mobile devices with limited computational resources.
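The weight normalization FedWS applies during local training follows the general Weight Standardization idea: each convolutional filter's weights are shifted and scaled to zero mean and unit variance over their fan-in before the convolution is computed. The sketch below is a minimal NumPy illustration of that operation under assumed shapes, not the authors' implementation; the function name, `eps` value, and tensor layout are assumptions.

```python
import numpy as np

def weight_standardize(w, eps=1e-5):
    """Standardize conv weights per output filter: zero mean and unit
    variance over each filter's (in_channels, kH, kW) fan-in, applied
    before the convolution uses the weights."""
    # w has shape (out_channels, in_channels, kH, kW)
    mean = w.mean(axis=(1, 2, 3), keepdims=True)
    std = w.std(axis=(1, 2, 3), keepdims=True)
    return (w - mean) / (std + eps)

# Example: a random 4-filter, 3-channel, 3x3 conv kernel
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3, 3, 3))
w_hat = weight_standardize(w)
print(np.allclose(w_hat.mean(axis=(1, 2, 3)), 0, atol=1e-7))  # prints: True
```

In a federated round, each client would standardize its local weights this way during training, which keeps the local gradients better conditioned even when clients hold heterogeneously distributed data.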
Journal introduction:
Computing infrastructures and systems are constantly evolving, resulting in increasingly complex and collaborative scientific applications. To cope with these advancements, there is a growing need for collaborative tools that can effectively map, control, and execute these applications.
Furthermore, with the explosion of Big Data, there is a requirement for innovative methods and infrastructures to collect, analyze, and derive meaningful insights from the vast amount of data generated. This necessitates the integration of computational and storage capabilities, databases, sensors, and human collaboration.
Future Generation Computer Systems aims to pioneer advancements in distributed systems, collaborative environments, high-performance computing, and Big Data analytics. It strives to stay at the forefront of developments in grids, clouds, and the Internet of Things (IoT) to effectively address the challenges posed by these wide-area, fully distributed sensing and computing systems.