Online Device Scheduling and Model Partition in Hybrid Asynchronous Split Federated Learning

Shunfeng Chu; Yiyang Ni; Jun Li; Kang Wei; Jianxin Wang

IEEE Communications Letters, vol. 29, no. 8, pp. 1869-1873, published 2025-06-09. DOI: 10.1109/LCOMM.2025.3577786
Citations: 0
Abstract
Federated Learning (FL) has attracted significant attention for its ability to collaboratively train neural network (NN) models across multiple data owners while protecting data privacy. However, FL over wireless networks faces two critical challenges: constrained resources on the device side and stringent synchronous updates across devices. This letter proposes a Hybrid Asynchronous Split FL (HASFL) framework, which combines the strengths of asynchronous FL and split FL, allowing devices to update the model asynchronously and offload part of the training workload to the server. To further enhance the efficiency of HASFL, we formulate a multi-objective optimization problem with long-term constraints that minimizes latency and energy consumption while maintaining training performance. Furthermore, we propose a novel online scheduling scheme, EDC-LinUCB, based on the Linear Upper Confidence Bound (LinUCB) algorithm, which adaptively selects devices and determines the optimal partition layer of the NN model in dynamic environments; its theoretical performance is validated by a regret analysis. Numerical simulations demonstrate the effectiveness and superiority of the proposed algorithm.
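The abstract does not specify the details of EDC-LinUCB, so the sketch below is only a minimal, generic LinUCB loop illustrating the kind of contextual-bandit scheduling it describes: each arm is a hypothetical (device, partition-layer) pair, the context features and the latency/energy cost model are invented for illustration, and the reward is the negative of the observed cost. None of this is the authors' actual algorithm.

```python
import numpy as np

# Illustrative assumptions: 4 devices, 3 candidate cut layers, 3 context
# features (channel quality, device compute, activation size at the cut).
rng = np.random.default_rng(0)
DEVICES, LAYERS, DIM, ALPHA = 4, 3, 3, 1.0
ARMS = [(d, l) for d in range(DEVICES) for l in range(LAYERS)]

A = [np.eye(DIM) for _ in ARMS]      # per-arm ridge-regression Gram matrices
b = [np.zeros(DIM) for _ in ARMS]    # per-arm reward statistics
counts = np.zeros(len(ARMS), dtype=int)

def context(device, layer):
    """Toy per-round features for one (device, cut-layer) arm."""
    return np.array([rng.uniform(0.1, 1.0),      # channel quality
                     0.2 + 0.2 * device,         # device compute capability
                     1.0 / (layer + 1)])         # activation size at the cut

def cost(x):
    """Toy latency-plus-energy cost; lower is better."""
    return 1.0 / x[0] + 0.5 / x[1] + x[2] + rng.normal(0.0, 0.05)

for t in range(500):
    xs = [context(d, l) for d, l in ARMS]
    scores = []
    for Ai, bi, xi in zip(A, b, xs):
        Ai_inv = np.linalg.inv(Ai)
        theta = Ai_inv @ bi                                   # reward estimate
        scores.append(theta @ xi + ALPHA * np.sqrt(xi @ Ai_inv @ xi))  # UCB
    k = int(np.argmax(scores))       # schedule this (device, cut layer)
    r = -cost(xs[k])                 # minimize cost by maximizing -cost
    A[k] += np.outer(xs[k], xs[k])
    b[k] += r * xs[k]
    counts[k] += 1

print("most-scheduled (device, cut layer):", ARMS[int(counts.argmax())])
```

Over the rounds, the exploration bonus shrinks for well-observed arms and the loop converges toward the (device, partition-layer) pair with the lowest estimated cost; the paper's scheme additionally handles long-term constraints and asynchronous updates, which this sketch omits.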
Journal Introduction:
The IEEE Communications Letters publishes short papers in a rapid publication cycle on advances in the state-of-the-art of communication over different media and channels including wire, underground, waveguide, optical fiber, and storage channels. Both theoretical contributions (including new techniques, concepts, and analyses) and practical contributions (including system experiments and prototypes, and new applications) are encouraged. This journal focuses on the physical layer and the link layer of communication systems.