Experimental Estimation of LTE-A Performance

Imane Oussakel, P. Owezarski, Pascal Berthou
{"title":"LTE-A性能的实验估计","authors":"Imane Oussakel, P. Owezarski, Pascal Berthou","doi":"10.23919/CNSM46954.2019.9012663","DOIUrl":null,"url":null,"abstract":"In cellular networks, the emergence of machine communications such as connected vehicles increases the high demand of uplink transmissions, thus, degrading the quality of service per user equipment. Enforcing quality-of-service in such cellular network is challenging, as radio phenomena, as well as user (and their devices) mobility and dynamics, are uncontrolled. To solve this issue, estimating what the quality of transmissions will be in a short future for a connected user is essential. For that purpose, we argue that lower layer metrics are a key feature whose evolution can help predict the bandwidth that the considered connections can take advantage of in the following hundreds of milliseconds. The paper then describes how a 4G testbed has been deployed in order to investigate throughput prediction in uplink transmissions at a small time granularity of 100 ms. Based on lower layer metrics (physical and mac layers), the main supervised machine learning algorithms are used, such as Linear Regressor and Random Forest to predict the uplink received bandwidth in different radio phenomena environment. Hence, a deep investigation of the impact of radio issues on bandwidth prediction is conducted. Further, our evaluation shows that the prediction is highly accurate: at the time granularity of 100 ms, the average prediction error is in the range of 6% to 12% for all the scenarios we explored.","PeriodicalId":273818,"journal":{"name":"2019 15th International Conference on Network and Service Management (CNSM)","volume":"76 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Experimental Estimation of LTE-A Performance\",\"authors\":\"Imane Oussakel, P. Owezarski, Pascal Berthou\",\"doi\":\"10.23919/CNSM46954.2019.9012663\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In cellular networks, the emergence of machine communications such as connected vehicles increases the high demand of uplink transmissions, thus, degrading the quality of service per user equipment. Enforcing quality-of-service in such cellular network is challenging, as radio phenomena, as well as user (and their devices) mobility and dynamics, are uncontrolled. To solve this issue, estimating what the quality of transmissions will be in a short future for a connected user is essential. For that purpose, we argue that lower layer metrics are a key feature whose evolution can help predict the bandwidth that the considered connections can take advantage of in the following hundreds of milliseconds. The paper then describes how a 4G testbed has been deployed in order to investigate throughput prediction in uplink transmissions at a small time granularity of 100 ms. Based on lower layer metrics (physical and mac layers), the main supervised machine learning algorithms are used, such as Linear Regressor and Random Forest to predict the uplink received bandwidth in different radio phenomena environment. Hence, a deep investigation of the impact of radio issues on bandwidth prediction is conducted. 
Further, our evaluation shows that the prediction is highly accurate: at the time granularity of 100 ms, the average prediction error is in the range of 6% to 12% for all the scenarios we explored.\",\"PeriodicalId\":273818,\"journal\":{\"name\":\"2019 15th International Conference on Network and Service Management (CNSM)\",\"volume\":\"76 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 15th International Conference on Network and Service Management (CNSM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.23919/CNSM46954.2019.9012663\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 15th International Conference on Network and Service Management (CNSM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/CNSM46954.2019.9012663","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

In cellular networks, the emergence of machine communications such as connected vehicles increases the demand for uplink transmissions, thus degrading the quality of service per user equipment. Enforcing quality of service in such cellular networks is challenging, as radio phenomena, as well as user (and device) mobility and dynamics, are uncontrolled. To address this issue, it is essential to estimate what the quality of transmissions will be in the near future for a connected user. For that purpose, we argue that lower-layer metrics are key features whose evolution can help predict the bandwidth that the considered connections can exploit over the following hundreds of milliseconds. The paper then describes how a 4G testbed has been deployed to investigate throughput prediction for uplink transmissions at a fine time granularity of 100 ms. Based on lower-layer metrics (physical and MAC layers), standard supervised machine learning algorithms, such as linear regression and Random Forest, are used to predict the received uplink bandwidth under different radio conditions. A thorough investigation of the impact of radio impairments on bandwidth prediction is thus conducted. Further, our evaluation shows that the prediction is highly accurate: at a time granularity of 100 ms, the average prediction error is in the range of 6% to 12% for all the scenarios we explored.
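The paper itself contains no code; the sketch below only illustrates the kind of pipeline the abstract describes: training linear regression and Random Forest regressors on per-100 ms lower-layer features to predict the received uplink bandwidth. The feature set (SINR, CQI, MCS index, allocated PRBs), the synthetic data, and the error metric are assumptions made for this illustration, not the authors' dataset or implementation.

```python
# Illustrative sketch (not the authors' code): predict received uplink
# bandwidth per 100 ms window from hypothetical PHY/MAC features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
n_windows = 5000  # number of 100 ms observation windows

# Hypothetical per-window lower-layer features: mean SINR (dB), reported CQI,
# selected MCS index, and number of allocated uplink PRBs.
sinr = rng.normal(15.0, 5.0, n_windows)
cqi = np.clip(np.round(sinr / 2.0 + rng.normal(0.0, 1.0, n_windows)), 1, 15)
mcs = np.clip(np.round(cqi * 1.8 + rng.normal(0.0, 1.0, n_windows)), 0, 28)
prbs = rng.integers(1, 50, n_windows)

X = np.column_stack([sinr, cqi, mcs, prbs])
# Synthetic target: received uplink throughput (kbit/s) in the next window.
y = prbs * (mcs + 1) * 12.0 + rng.normal(0.0, 200.0, n_windows)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = [("Linear regression", LinearRegression()),
          ("Random Forest", RandomForestRegressor(n_estimators=100,
                                                  random_state=0))]
for name, model in models:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    # Mean absolute percentage error, guarding against near-zero targets.
    mape = np.mean(np.abs(y_test - pred) / np.maximum(np.abs(y_test), 1.0)) * 100
    print(f"{name}: mean absolute percentage error = {mape:.1f}%")
```

The 6% to 12% average error reported in the abstract refers to the authors' real testbed traces at 100 ms granularity; the percentage error printed by this sketch reflects only the synthetic data generated above.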