A Reinforcement Learning Algorithm for Resource Provisioning in Mobile Edge Computing Network
Huynh Thi Thanh Binh, Phi-Le Nguyen, B. Nguyen, Trinh Thu Hai, Q. Ngo, D. Son
IEEE International Joint Conference on Neural Network (IJCNN), 2020. DOI: 10.1109/IJCNN48605.2020.9206947
Citations: 7
Abstract
Mobile edge computing (MEC) is a model that integrates computing power into telecommunications networks to improve communication and data-processing efficiency. Supplying enough power to sustain the computing capacity of the edge servers in an MEC network is therefore critical. In many cases, a continuous power supply cannot be guaranteed because servers are deployed in hard-to-reach locations such as outlying areas, forests, and islands. In these settings, renewable energy becomes a viable power source for ensuring stable operation. This paper addresses resource provisioning in MEC networks powered by renewable energy. We formulate the problem as a Markov Decision Process and introduce a new reinforcement learning approach that optimizes it with respect to energy and time costs. Our simulations validate the efficacy of the algorithm, which achieves roughly one third the cost of the other methods.
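To make the MDP-plus-reinforcement-learning idea concrete, the following is a minimal, hypothetical sketch of tabular Q-learning for renewable-energy-aware provisioning. The state (discretized battery level), action set (number of CPU cores powered on), harvest model, and reward (a combined energy-plus-latency cost with a penalty for draining the battery) are all illustrative assumptions, not the paper's actual formulation.

```python
import random

BATTERY_LEVELS = 5          # assumed: discretized battery state of an edge server
ACTIONS = [0, 1, 2]         # assumed: number of CPU cores to power on
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

def step(battery, action, harvest):
    """Assumed environment: powering more cores costs more energy but lowers
    service latency; harvested (renewable) energy refills the battery."""
    energy_cost = action                       # one energy unit per core
    latency_cost = len(ACTIONS) - 1 - action   # fewer cores -> slower service
    reward = -(energy_cost + latency_cost)     # minimize combined cost
    next_battery = battery - energy_cost + harvest
    if battery - energy_cost < 0:              # infeasible: battery drained
        reward -= 10
        next_battery = harvest
    next_battery = min(BATTERY_LEVELS - 1, max(0, next_battery))
    return next_battery, reward

def train(episodes=2000, seed=0):
    rng = random.Random(seed)
    Q = [[0.0] * len(ACTIONS) for _ in range(BATTERY_LEVELS)]
    for _ in range(episodes):
        battery = BATTERY_LEVELS - 1
        for _ in range(24):                    # one simulated day of decisions
            harvest = rng.choice([0, 1, 2])    # stochastic renewable supply
            if rng.random() < EPS:             # epsilon-greedy exploration
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[battery][x])
            nb, r = step(battery, a, harvest)
            # Standard Q-learning temporal-difference update
            Q[battery][a] += ALPHA * (r + GAMMA * max(Q[nb]) - Q[battery][a])
            battery = nb
    return Q

Q = train()
# Greedy policy: cores to power on at each battery level
policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(BATTERY_LEVELS)]
print(policy)
```

The learned policy maps each battery level to a provisioning action; the paper's actual method operates on a richer state and cost model, but the update rule above illustrates the core reinforcement-learning mechanism.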