{"title":"整合数据收集、通信和计算,实现重要性感知在线边缘学习任务","authors":"Nan Wang;Yinglei Teng;Kaibin Huang","doi":"10.1109/TWC.2024.3522956","DOIUrl":null,"url":null,"abstract":"With the prevalence of real-time intelligence applications, online edge learning (OEL) has gained increasing attentions due to the ability of rapidly accessing environmental data to improve artificial intelligence models by edge computing. However, the performance of OEL is intricately tied to the dynamic nature of incoming data in ever-changing environments, which does not conform to a stationary distribution. In this work, we develop a data importance-aware collection, communication, and computation integration framework to boost the training efficiency by leveraging the varying data usefulness under dynamic network resources. A model convergence metric (MCM) is firstly derived that quantifies the data importance in mini-batch gradient descent (MGD)-based online learning tasks. To expedite model learning at the edge, we optimize training batch configuration and fine-tune the acquisition of important data through coordinated scheduling, encompassing data sampling, transmission and computational resource allocation. To cope with the time discrepancy and complex coupling of decision variables, we design a two-timescale hierarchical reinforcement learning (TTHRL) algorithm decomposing the original problem into two-layer subproblems and separately optimize the subproblems in a mixed timescale pattern. Experiments show that the proposed data integration framework can effectively improve the online learning efficiency while stabilizing caching queues in the system.","PeriodicalId":13431,"journal":{"name":"IEEE Transactions on Wireless Communications","volume":"24 3","pages":"2606-2619"},"PeriodicalIF":10.7000,"publicationDate":"2025-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Integrating Data Collection, Communication, and Computation for Importance-Aware Online Edge Learning Tasks\",\"authors\":\"Nan Wang;Yinglei Teng;Kaibin Huang\",\"doi\":\"10.1109/TWC.2024.3522956\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the prevalence of real-time intelligence applications, online edge learning (OEL) has gained increasing attentions due to the ability of rapidly accessing environmental data to improve artificial intelligence models by edge computing. However, the performance of OEL is intricately tied to the dynamic nature of incoming data in ever-changing environments, which does not conform to a stationary distribution. In this work, we develop a data importance-aware collection, communication, and computation integration framework to boost the training efficiency by leveraging the varying data usefulness under dynamic network resources. A model convergence metric (MCM) is firstly derived that quantifies the data importance in mini-batch gradient descent (MGD)-based online learning tasks. To expedite model learning at the edge, we optimize training batch configuration and fine-tune the acquisition of important data through coordinated scheduling, encompassing data sampling, transmission and computational resource allocation. To cope with the time discrepancy and complex coupling of decision variables, we design a two-timescale hierarchical reinforcement learning (TTHRL) algorithm decomposing the original problem into two-layer subproblems and separately optimize the subproblems in a mixed timescale pattern. 
Experiments show that the proposed data integration framework can effectively improve the online learning efficiency while stabilizing caching queues in the system.\",\"PeriodicalId\":13431,\"journal\":{\"name\":\"IEEE Transactions on Wireless Communications\",\"volume\":\"24 3\",\"pages\":\"2606-2619\"},\"PeriodicalIF\":10.7000,\"publicationDate\":\"2025-01-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Wireless Communications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10855354/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Wireless Communications","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10855354/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Integrating Data Collection, Communication, and Computation for Importance-Aware Online Edge Learning Tasks
With the prevalence of real-time intelligence applications, online edge learning (OEL) has gained increasing attention due to its ability to rapidly access environmental data and improve artificial intelligence models through edge computing. However, the performance of OEL is intricately tied to the dynamic nature of incoming data in ever-changing environments, which does not conform to a stationary distribution. In this work, we develop a data importance-aware collection, communication, and computation integration framework that boosts training efficiency by leveraging the varying usefulness of data under dynamic network resources. A model convergence metric (MCM) is first derived to quantify data importance in mini-batch gradient descent (MGD)-based online learning tasks. To expedite model learning at the edge, we optimize the training batch configuration and fine-tune the acquisition of important data through coordinated scheduling, encompassing data sampling, transmission, and computational resource allocation. To cope with the time discrepancy and complex coupling of the decision variables, we design a two-timescale hierarchical reinforcement learning (TTHRL) algorithm that decomposes the original problem into two layers of subproblems and optimizes them separately in a mixed-timescale pattern. Experiments show that the proposed data integration framework effectively improves online learning efficiency while stabilizing the caching queues in the system.
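To make the idea of importance-aware batch construction in MGD-based online learning concrete, the following is a minimal, self-contained sketch. It is not the paper's algorithm: it assumes a simple linear-regression learner, uses per-sample loss as a stand-in importance score (the paper's MCM and its coordinated sampling/transmission/computation scheduling are not reproduced), and models the edge cache as a fixed-size buffer refreshed with non-stationary arrivals.

```python
# Illustrative sketch only: importance-aware mini-batch selection for an
# online, mini-batch gradient descent (MGD) learner at the edge.
# Assumptions (not from the paper): linear regression model, per-sample
# squared error as the importance proxy, fixed-size cache of arriving data.
import numpy as np

rng = np.random.default_rng(0)

def importance(w, X, y):
    """Per-sample squared error, used here as a stand-in importance score."""
    residual = X @ w - y
    return residual ** 2

def select_batch(scores, batch_size):
    """Sample cached indices with probability proportional to importance."""
    p = scores / scores.sum()
    return rng.choice(len(scores), size=batch_size, replace=False, p=p)

def mgd_step(w, X, y, lr=0.05):
    """One mini-batch gradient descent step for squared loss."""
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Online loop: each round, new data arrives at the edge cache, the most
# important cached samples form the training batch, and the model is updated.
dim, cache_size, batch_size = 5, 64, 8
w_true = rng.normal(size=dim)          # hypothetical ground-truth model
w = np.zeros(dim)
X_cache = rng.normal(size=(cache_size, dim))
y_cache = X_cache @ w_true + 0.1 * rng.normal(size=cache_size)

for _ in range(200):
    # Newly arriving samples overwrite random cache slots (non-stationary stream).
    idx_new = rng.choice(cache_size, size=4, replace=False)
    X_cache[idx_new] = rng.normal(size=(4, dim))
    y_cache[idx_new] = X_cache[idx_new] @ w_true + 0.1 * rng.normal(size=4)

    scores = importance(w, X_cache, y_cache)
    batch = select_batch(scores, batch_size)
    w = mgd_step(w, X_cache[batch], y_cache[batch])

print("parameter error:", np.linalg.norm(w - w_true))
```

In this toy setup, weighting the batch toward high-loss samples is one simple way to prioritize "important" data; the paper instead derives the MCM and jointly schedules sampling, transmission, and computation, which this sketch does not attempt to model.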
Journal Introduction:
The IEEE Transactions on Wireless Communications is a prestigious publication that showcases cutting-edge advancements in wireless communications. It welcomes both theoretical and practical contributions in various areas. The scope of the Transactions encompasses a wide range of topics, including modulation and coding, detection and estimation, propagation and channel characterization, and diversity techniques. The journal also emphasizes the physical and link layer communication aspects of network architectures and protocols.
The journal is also open to papers on non-traditional topics tied to specific application areas, including simulation tools and methodologies, orthogonal frequency division multiplexing, MIMO systems, and wireless over optical technologies.
Overall, the IEEE Transactions on Wireless Communications serves as a platform for high-quality manuscripts that push the boundaries of wireless communications and contribute to advancements in the field.