{"title":"利用深度强化学习实现边缘网络中的协作视频缓存","authors":"Anirban Lekharu, Pranav Gupta, Arijit Sur, Moumita Patra","doi":"10.1145/3664613","DOIUrl":null,"url":null,"abstract":"With the enormous growth in mobile data traffic over the 5G environment, Adaptive BitRate (ABR) video streaming has become a challenging problem. Recent advances in Mobile Edge Computing (MEC) technology make it feasible to use Base Stations (BSs) intelligently by network caching, popularity-based video streaming, etc. Additional computing resources on the edge node offer an opportunity to reduce network traffic on the backhaul links during peak traffic hours. More recently, it has been found in the literature that collaborative caching strategies between neighbouring BSs (i.e., MEC servers) make it more efficient to reduce backhaul traffic and network congestion and thus improve the viewer experience substantially. In this work, we propose a Reinforcement Learning (RL) based collaborative caching mechanism where the edge servers cooperate to serve the requested content from the end-users. Specifically, this research aims to improve the overall cache hit rate at the MEC, where the edge servers are clustered based on their geographic locations. The said task is modelled as a multi-objective optimization problem and solved using an RL framework. In addition, a novel cache admission and eviction policy is defined by calculating the priority score of video segments in the clustered MEC mesh network.","PeriodicalId":29764,"journal":{"name":"ACM Transactions on Internet of Things","volume":null,"pages":null},"PeriodicalIF":3.5000,"publicationDate":"2024-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Collaborative Video Caching in the Edge Network using Deep Reinforcement Learning\",\"authors\":\"Anirban Lekharu, Pranav Gupta, Arijit Sur, Moumita Patra\",\"doi\":\"10.1145/3664613\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the enormous growth in mobile data traffic over the 5G environment, Adaptive BitRate (ABR) video streaming has become a challenging problem. Recent advances in Mobile Edge Computing (MEC) technology make it feasible to use Base Stations (BSs) intelligently by network caching, popularity-based video streaming, etc. Additional computing resources on the edge node offer an opportunity to reduce network traffic on the backhaul links during peak traffic hours. More recently, it has been found in the literature that collaborative caching strategies between neighbouring BSs (i.e., MEC servers) make it more efficient to reduce backhaul traffic and network congestion and thus improve the viewer experience substantially. In this work, we propose a Reinforcement Learning (RL) based collaborative caching mechanism where the edge servers cooperate to serve the requested content from the end-users. Specifically, this research aims to improve the overall cache hit rate at the MEC, where the edge servers are clustered based on their geographic locations. The said task is modelled as a multi-objective optimization problem and solved using an RL framework. 
In addition, a novel cache admission and eviction policy is defined by calculating the priority score of video segments in the clustered MEC mesh network.\",\"PeriodicalId\":29764,\"journal\":{\"name\":\"ACM Transactions on Internet of Things\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.5000,\"publicationDate\":\"2024-05-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Internet of Things\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3664613\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Internet of Things","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3664613","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Collaborative Video Caching in the Edge Network using Deep Reinforcement Learning
With the enormous growth in mobile data traffic in 5G environments, Adaptive BitRate (ABR) video streaming has become a challenging problem. Recent advances in Mobile Edge Computing (MEC) make it feasible to use Base Stations (BSs) intelligently through network caching, popularity-based video streaming, and similar techniques. Additional computing resources on edge nodes offer an opportunity to reduce traffic on backhaul links during peak hours. More recently, the literature has shown that collaborative caching strategies between neighbouring BSs (i.e., MEC servers) reduce backhaul traffic and network congestion more effectively and thus substantially improve the viewer experience. In this work, we propose a Reinforcement Learning (RL) based collaborative caching mechanism in which edge servers cooperate to serve content requested by end-users. Specifically, this research aims to improve the overall cache hit rate at the MEC, where the edge servers are clustered by geographic location. This task is modelled as a multi-objective optimization problem and solved within an RL framework. In addition, a novel cache admission and eviction policy is defined by computing a priority score for video segments in the clustered MEC mesh network.
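To make the priority-score admission/eviction idea concrete, below is a minimal illustrative sketch, in Python, of a segment cache on a single MEC server. The scoring rule (local request counts weighted against demand reported by neighbouring servers in the cluster) and all names such as EdgeSegmentCache, local_weight, and report_neighbour_request are hypothetical stand-ins; the paper's actual priority formula and the RL policy that drives caching decisions are not given in the abstract.

from collections import defaultdict

class EdgeSegmentCache:
    """Illustrative priority-score cache for video segments on one MEC server.

    The scoring below is a hypothetical stand-in, not the paper's method:
    it simply weights local request frequency against cluster-wide demand.
    """

    def __init__(self, capacity, local_weight=0.7, cluster_weight=0.3):
        self.capacity = capacity              # max number of segments held
        self.local_weight = local_weight
        self.cluster_weight = cluster_weight
        self.cache = set()                    # segment ids currently cached
        self.local_hits = defaultdict(int)    # requests seen at this server
        self.cluster_hits = defaultdict(int)  # requests reported by neighbours

    def priority(self, seg_id):
        # Higher score means the segment is more valuable to keep cached.
        return (self.local_weight * self.local_hits[seg_id]
                + self.cluster_weight * self.cluster_hits[seg_id])

    def request(self, seg_id):
        """Serve a local request; returns True on a cache hit."""
        self.local_hits[seg_id] += 1
        if seg_id in self.cache:
            return True
        self._admit(seg_id)
        return False

    def _admit(self, seg_id):
        if len(self.cache) >= self.capacity:
            # Evict the lowest-priority segment only if the newcomer beats it.
            victim = min(self.cache, key=self.priority)
            if self.priority(seg_id) <= self.priority(victim):
                return  # newcomer does not justify an eviction
            self.cache.remove(victim)
        self.cache.add(seg_id)

    def report_neighbour_request(self, seg_id):
        """Neighbouring servers in the cluster share their demand counts."""
        self.cluster_hits[seg_id] += 1

In such a setup, a server that misses locally could first ask its cluster peers for the segment before fetching it over the backhaul; the RL agent described in the abstract would then be responsible for learning when cooperation and which admissions or evictions best serve the multi-objective trade-off between cache hit rate and backhaul load.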