{"title":"Enhancing Information Freshness and Energy Efficiency in D2D Networks Through DRL-Based Scheduling and Resource Management","authors":"Parisa Parhizgar;Mehdi Mahdavi;Mohammad Reza Ahmadzadeh;Melike Erol-Kantarci","doi":"10.1109/OJVT.2024.3502803","DOIUrl":null,"url":null,"abstract":"This paper investigates resource management in device-to-device (D2D) networks coexisting with cellular user equipment (CUEs). We introduce a novel model for joint scheduling and resource management in D2D networks, taking into account environmental constraints. To preserve information freshness, measured by minimizing the average age of information (AoI), and to effectively utilize energy harvesting (EH) technology to satisfy the network's energy needs, we formulate an online optimization problem. This formulation considers factors such as the quality of service (QoS) for both CUEs and D2Ds, available power, information freshness, and environmental sensing requirements. Due to the mixed-integer nonlinear nature and online characteristics of the problem, we propose a deep reinforcement learning (DRL) approach to solve it effectively. Numerical results show that the proposed joint scheduling and resource management strategy, utilizing the soft actor-critic (SAC) algorithm, reduces the average AoI by 20% compared to other baseline methods.","PeriodicalId":34270,"journal":{"name":"IEEE Open Journal of Vehicular Technology","volume":"6 ","pages":"52-67"},"PeriodicalIF":5.3000,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10758763","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of Vehicular Technology","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10758763/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
This paper investigates resource management in device-to-device (D2D) networks coexisting with cellular user equipment (CUEs). We introduce a novel model for joint scheduling and resource management in D2D networks that accounts for environmental constraints. To preserve information freshness, quantified by the average age of information (AoI), and to effectively exploit energy harvesting (EH) to meet the network's energy needs, we formulate an online optimization problem that minimizes the average AoI. The formulation considers the quality of service (QoS) of both CUEs and D2D pairs, the available power, information freshness, and environmental sensing requirements. Because the problem is a mixed-integer nonlinear program that must be solved online, we propose a deep reinforcement learning (DRL) approach to solve it effectively. Numerical results show that the proposed joint scheduling and resource management strategy, built on the soft actor-critic (SAC) algorithm, reduces the average AoI by 20% compared to baseline methods.
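The abstract only summarizes the formulation, so the sketch below is a rough illustration, not the authors' model: it simulates per-slot AoI and harvested-energy dynamics for a few D2D links under a simple greedy max-AoI scheduling rule. All names and parameters (NUM_LINKS, TX_COST, P_SUCCESS, the Bernoulli harvesting model) are assumptions made for illustration; the paper's actual problem additionally enforces CUE/D2D QoS and sensing constraints and replaces the greedy rule with a SAC-based DRL policy.

```python
# Minimal sketch (assumed dynamics, not the paper's code): per-slot AoI and
# battery evolution for D2D links with energy harvesting under a greedy policy.
import numpy as np

rng = np.random.default_rng(0)

NUM_LINKS = 4        # assumed number of D2D pairs
SLOTS = 1000         # simulation horizon in time slots
TX_COST = 2.0        # assumed energy units consumed per transmission
BATTERY_MAX = 10.0   # assumed battery capacity
P_SUCCESS = 0.8      # assumed per-transmission success probability

aoi = np.ones(NUM_LINKS)           # age of information per link (slots)
battery = np.full(NUM_LINKS, 5.0)  # stored harvested energy per link

aoi_trace = []
for t in range(SLOTS):
    # Toy policy: schedule the link with the largest AoI that can afford to transmit.
    # A DRL (e.g., SAC-based) policy would instead map the joint state to this decision.
    eligible = np.where(battery >= TX_COST)[0]
    scheduled = eligible[np.argmax(aoi[eligible])] if eligible.size else None

    # AoI update: every link ages by one slot; a successful delivery resets its AoI.
    aoi += 1.0
    if scheduled is not None:
        battery[scheduled] -= TX_COST
        if rng.random() < P_SUCCESS:
            aoi[scheduled] = 1.0

    # Energy harvesting: Bernoulli arrival of one energy unit per link per slot.
    battery = np.minimum(battery + rng.binomial(1, 0.3, NUM_LINKS), BATTERY_MAX)
    aoi_trace.append(aoi.mean())

print(f"time-averaged AoI over all links: {np.mean(aoi_trace):.2f} slots")
```

Running the sketch gives a time-averaged AoI for the greedy baseline; in the paper, such a quantity is the objective that the learned scheduling and resource-management policy is reported to reduce by about 20% relative to baselines.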