Resource Prediction & Allocation in Cloud Radio Access Networks using Machine Learning

H. Hesham, G. Yasser, M. Ashour, T. Elshabrawy
DOI: 10.1109/NILES50944.2020.9257926
2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES), published 2020-10-24
With the evolution of 5G and the need to provide on-demand radio services anywhere, anytime, offloading radio processing to a centralized cloud, where all computation and processing occur, provides flexibility in allocating and re-allocating resources to users according to their demand and the system capacity. This concept is the essence of Cloud Radio Access Networks (C-RANs). This technology brings two main challenges: first, how many resources are required given the system traffic load, and second, which resources should be assigned to which user to guarantee the best quality of service at the best resource utilization. Resources in this paper comprise both physical resources, namely servers in the cloud (Baseband Units, BBUs) and lightweight Remote Radio Heads (RRHs), and bandwidth resources expressed in Resource Blocks (RBs). The optimal allocation of these resources as a function of user traffic is a non-linear optimization problem that is computationally challenging and time-consuming to solve. Given the high frame rate, the delay associated with this computational complexity may degrade the quality of service. This paper explores different supervised machine learning algorithms to predict the number of RRHs, BBUs, and RBs the C-RAN needs, and then to allocate those resources, avoiding the heavy computation that resource allocation usually requires; this leads to an overall decrease in system latency and hence a more practical use of near-optimal solutions. The machine learning techniques considered include linear regression, logistic regression, and k-means clustering, with the allocation further improved using neural networks in comparison to logistic regression. Results show that the different machine learning techniques used for prediction and allocation are accurate in comparison to test data derived analytically using a heuristic approach.
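To make the prediction step concrete, the following is a minimal sketch of the kind of regression the abstract describes: fitting a linear model that maps traffic load to the number of Resource Blocks required. The dataset, the linear relationship, and the `predict_rbs` helper are all hypothetical illustrations, not taken from the paper.

```python
import numpy as np

# Synthetic training data (assumed, for illustration only): offered traffic
# load in Mbps vs. the number of RBs required, generated from a roughly
# linear ground truth plus noise.
rng = np.random.default_rng(0)
load = rng.uniform(10, 200, size=100)           # offered load per cell (Mbps)
rbs = 0.5 * load + 5 + rng.normal(0, 2, 100)    # assumed RB requirement

# Fit y = w*x + b by ordinary least squares via the normal equations.
X = np.column_stack([load, np.ones_like(load)])
w, b = np.linalg.lstsq(X, rbs, rcond=None)[0]

def predict_rbs(traffic_mbps: float) -> int:
    """Predict required RBs, rounded up so capacity is not underestimated."""
    return int(np.ceil(w * traffic_mbps + b))
```

Rounding the prediction up is one simple way to bias the model toward over-provisioning rather than under-provisioning, which matters when the predicted count feeds a hard allocation step.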
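The allocation side can be sketched similarly. Below, k-means clusters user positions so that each cluster is served by one RRH; the user coordinates, area size, and RRH count are hypothetical, and this plain Lloyd's-algorithm implementation stands in for whatever clustering setup the paper actually uses.

```python
import numpy as np

def kmeans(points: np.ndarray, k: int, iters: int = 50, seed: int = 0):
    """Plain Lloyd's algorithm: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each user to the nearest candidate RRH location.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each RRH candidate to the mean of its assigned users.
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids, labels

# Hypothetical scenario: 60 users in a 1 km x 1 km area, 3 RRHs.
rng = np.random.default_rng(1)
users = rng.uniform(0, 1000, size=(60, 2))
rrh_sites, assignment = kmeans(users, k=3)
```

Here the cluster count `k` would come from the prediction step (the number of RRHs the model says the system needs), so the two stages compose: predict the resource counts first, then cluster users onto those resources.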