Joint deep reinforcement learning strategy in MEC for smart internet of vehicles edge computing networks

Jiabin Luo, Qinyu Song, Fusen Guo, Haoyuan Wu, Hafizan Mat Som, Saad Alahmari, Azadeh Noori Hoshyar

Sustainable Computing: Informatics and Systems, Volume 46, Article 101121. Published 2025-03-29. DOI: 10.1016/j.suscom.2025.101121
Citations: 0
Abstract
The Internet of Vehicles (IoV) has limited computing capacity, which makes processing computation tasks challenging. Vehicular services are updated through communication and computing platforms, and edge computing is deployed close to the terminals to extend cloud computing facilities. However, given the limited resources of vehicular edge nodes, satisfying Quality of Experience (QoE) requirements remains a challenge. This paper develops a smart IoV scenario supported by mobile edge computing (MEC) by constructing collaborative processes, such as task offloading decisions and resource allocation, in roadside unit (RSU) environments covering multiple vehicles. Deep reinforcement learning (DRL) is then employed to solve the joint optimisation problem. Based on this joint optimisation model, offloading decisions and resource allocations are obtained that reduce the cost incurred in end-to-end delay and the expense of resource computation. The problem is formulated as a Markov Decision Process (MDP) with designed state, action, and reward functions. Performance evaluations and numerical results show that the proposed model achieves a lower average delay for 30 vehicle nodes in simulation.
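The abstract frames joint task offloading and resource allocation as an MDP with state, action, and reward functions, but the exact formulation is not reproduced here. The sketch below is a minimal, illustrative environment for such a formulation, not the authors' implementation: the class name `IoVOffloadEnv`, the capacity and uplink-rate constants, the unit costs, and the 0.7/0.3 delay-cost weights are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative MDP sketch for joint task offloading and resource allocation in
# an MEC-assisted IoV setting. State: task sizes and required CPU cycles per
# vehicle. Action: binary offloading decision plus an RSU compute share per
# vehicle. Reward: negative weighted sum of average delay and computation cost.
class IoVOffloadEnv:
    def __init__(self, n_vehicles=30, rsu_capacity=10e9, seed=0):
        self.n = n_vehicles
        self.rsu_capacity = rsu_capacity    # RSU CPU cycles per second (assumed)
        self.local_capacity = 1e9           # vehicle CPU cycles per second (assumed)
        self.uplink_rate = 20e6             # uplink bits per second per vehicle (assumed)
        self.rng = np.random.default_rng(seed)

    def reset(self):
        # Draw a random task (size in bits, required CPU cycles) per vehicle.
        self.task_bits = self.rng.uniform(1e5, 1e6, self.n)
        self.task_cycles = self.rng.uniform(1e8, 1e9, self.n)
        return self._state()

    def _state(self):
        return np.concatenate([self.task_bits, self.task_cycles])

    def step(self, offload, share):
        """offload: binary vector (n,); share: requested RSU compute fraction (n,)."""
        share = np.clip(share, 1e-6, None)
        share = share / share.sum()         # normalise RSU shares to sum to 1

        local_delay = self.task_cycles / self.local_capacity
        tx_delay = self.task_bits / self.uplink_rate
        edge_delay = self.task_cycles / (share * self.rsu_capacity)

        delay = np.where(offload == 1, tx_delay + edge_delay, local_delay)
        cost = np.where(offload == 1, share * 1.0, 0.2)   # illustrative unit costs

        # Reward: negative weighted sum of average end-to-end delay and cost.
        reward = -(0.7 * delay.mean() + 0.3 * cost.mean())
        done = True                          # single-step episode for simplicity
        return self._state(), reward, done, {"avg_delay": delay.mean()}


# Example: evaluate a random offloading policy for 30 vehicles.
env = IoVOffloadEnv(n_vehicles=30)
state = env.reset()
offload = np.random.default_rng(1).integers(0, 2, env.n)
share = np.full(env.n, 1.0 / env.n)
_, reward, _, info = env.step(offload, share)
print(f"reward={reward:.4f}, average delay={info['avg_delay']:.4f} s")
```

A DRL agent (for example a DQN or actor-critic policy) would interact with an environment of this shape, learning offloading and allocation actions that maximise the reward, i.e. minimise the weighted delay and computation cost described in the abstract.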
About the journal
Sustainable computing is a rapidly expanding research area spanning computer science and engineering, electrical engineering, and other engineering disciplines. The aim of Sustainable Computing: Informatics and Systems (SUSCOM) is to publish the myriad research findings related to energy-aware and thermal-aware management of computing resources. Equally important is a spectrum of related research issues, such as applications of computing that can have ecological and societal impacts. SUSCOM publishes original and timely research papers and survey articles on power, energy, temperature, and environment-related topics of current importance to readers. SUSCOM has an editorial board comprising prominent researchers from around the world and selects competitively evaluated peer-reviewed papers.