{"title":"Introduction to the Special Section on Learning-Based Modeling, Management, and Control for Computer and Communication Networks","authors":"Jian Tang, Rong L. Zheng, Ö. Akan, Weiyi Zhang","doi":"10.1109/tnse.2019.2961103","DOIUrl":null,"url":null,"abstract":"COMPUTER and communication networks are becoming larger and more complicated, generating a huge amount of runtime statistics data (such as traffic load, resource usages, etc.) every second. Meanwhile, emerging machine learning models and techniques, such as active learning, Deep Neural Networks (DNNs) and Deep Reinforcement Learning (DRL), have been shown to dramatically improve the state-of-the-art of many applications, including video/image processing, natural language processing, game playing, etc. This special issue aims to exploit how these emerging and powerful techniques can be leveraged to grasp the exciting opportunities provided by pervasive availability of voluminous data to model, manage and control computer and communication networks. We appreciate contributions to this special section and the valuable and extensive efforts of the reviewers. The topics of this special section range from modeling, analysis, demonstration of various networks with emerging machine learning techniques. A brief review follows: In “Mitigating bottlenecks in wide area data analytics via machine learning,” Wang et al. present a system framework that minimizes query response times by detecting and mitigating bottlenecks at runtime. In “Deep learning meets wireless network optimization: Identify critical links,” Liu et al. investigate how to exploit deep learning for significant performance gain in wireless network optimization by identifying the possibility that a smaller-sized problem can be solved while sharing equally optimal solutions with the original problem. For the first time, in “Channel selective activity recognition with WiFi: A deep learning approach exploring wideband information,” Wang et al. explore wideband WiFi information with advanced deep learning towards more accurate and robust activity recognition. The key innovation is to actively select available WiFi channels with good quality and seamlessly hop among adjacent channels to form an extended channel. In “Caching for mobile social networks with deep learning: Twitter analysis for 2016 U.S. election,” Tsai et al. discuss the problem of context-aware data caching in the heterogeneous small cell networks to reduce the service delay and how the device-to-device and device-to-infrastructure improve the system social welfare. In simulation, such scheme was shown to efficiently reduce the service latency during 2016 U.S. presidential election where mobile users were urgent to request the election information through wireless networks. In “Renewable energy-aware big data analytics in geo-distributed data centers with reinforcement learning,” Xu et al. investigate the cost minimization problem of big data analytics on geodistributed data centers connected to renewable energy sources with unpredictable capacity. Dai et al. propose in “Hierarchical and hybrid: Mobility-compatible database-assisted framework for dynamic spectrum access” a hierarchical framework to enable the hybrid spectrum access scheme. By building relatively reliable clusters, and connecting cluster heads to the spectrum database, nodes with poor or no connections to the database can also benefit from spectrum maps. 
Finally, “Channel state information prediction for 5G wireless communications: A deep learning approach” by Luo et al. studies one of the most fundamental problems in wireless communication systems, Channel state information (CSI) estimation. The authors propose an efficient online CSI prediction scheme for predicting CSI from historical data in 5G wireless communication systems.","PeriodicalId":407574,"journal":{"name":"IEEE Trans. Netw. Sci. Eng.","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Trans. Netw. Sci. Eng.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/tnse.2019.2961103","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
COMPUTER and communication networks are becoming larger and more complicated, generating a huge amount of runtime statistics (such as traffic load and resource usage) every second. Meanwhile, emerging machine learning models and techniques, such as active learning, Deep Neural Networks (DNNs), and Deep Reinforcement Learning (DRL), have been shown to dramatically improve the state of the art in many applications, including video/image processing, natural language processing, and game playing. This special section explores how these emerging and powerful techniques can be leveraged to seize the exciting opportunities offered by the pervasive availability of voluminous data to model, manage, and control computer and communication networks. We appreciate the contributions to this special section and the valuable and extensive efforts of the reviewers. The topics span the modeling, analysis, and demonstration of various networks with emerging machine learning techniques. A brief review follows.

In "Mitigating bottlenecks in wide area data analytics via machine learning," Wang et al. present a system framework that minimizes query response times by detecting and mitigating bottlenecks at runtime.

In "Deep learning meets wireless network optimization: Identify critical links," Liu et al. investigate how deep learning can yield significant performance gains in wireless network optimization by identifying when a smaller problem can be solved that shares an equally optimal solution with the original problem.

In "Channel selective activity recognition with WiFi: A deep learning approach exploring wideband information," Wang et al. explore, for the first time, wideband WiFi information with advanced deep learning toward more accurate and robust activity recognition. The key innovation is to actively select available WiFi channels with good quality and to hop seamlessly among adjacent channels to form an extended channel.

In "Caching for mobile social networks with deep learning: Twitter analysis for 2016 U.S. election," Tsai et al. discuss context-aware data caching in heterogeneous small-cell networks to reduce service delay, and examine how device-to-device and device-to-infrastructure communications improve the system's social welfare. In simulations, the scheme was shown to effectively reduce service latency during the 2016 U.S. presidential election, when mobile users urgently requested election information over wireless networks.

In "Renewable energy-aware big data analytics in geo-distributed data centers with reinforcement learning," Xu et al. investigate the cost-minimization problem of big data analytics in geo-distributed data centers connected to renewable energy sources with unpredictable capacity.

In "Hierarchical and hybrid: Mobility-compatible database-assisted framework for dynamic spectrum access," Dai et al. propose a hierarchical framework that enables a hybrid spectrum access scheme. By building relatively reliable clusters and connecting cluster heads to the spectrum database, nodes with poor or no connections to the database can also benefit from spectrum maps.

Finally, "Channel state information prediction for 5G wireless communications: A deep learning approach" by Luo et al. studies one of the most fundamental problems in wireless communication systems: channel state information (CSI) estimation. The authors propose an efficient online scheme that predicts CSI from historical data in 5G wireless communication systems.
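To make the CSI-prediction task concrete, the following is a minimal, hypothetical sketch (not the scheme of Luo et al.) that frames online CSI prediction as sliding-window time-series regression: each future CSI magnitude is predicted from the most recent samples with a least-squares autoregressive model. The synthetic data, the window size, and the helper names `make_training_pairs` and `fit_linear_predictor` are illustrative assumptions only.

```python
# Illustrative sketch only: generic autoregressive CSI prediction from
# historical samples, not the deep-learning scheme proposed by Luo et al.
import numpy as np


def make_training_pairs(csi_history, window=8):
    """Build (past-window, next-sample) pairs from a 1-D CSI magnitude series."""
    X = np.stack([csi_history[i:i + window]
                  for i in range(len(csi_history) - window)])
    y = csi_history[window:]
    return X, y


def fit_linear_predictor(X, y):
    """Least-squares fit of a linear autoregressive one-step predictor."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic fading-like CSI magnitudes: a slow sinusoid plus small noise.
    t = np.arange(500)
    csi = 1.0 + 0.3 * np.sin(0.05 * t) + 0.02 * rng.standard_normal(t.size)

    X, y = make_training_pairs(csi, window=8)
    w = fit_linear_predictor(X[:400], y[:400])   # fit on the earlier history

    preds = X[400:] @ w                          # one-step-ahead predictions
    mse = float(np.mean((preds - y[400:]) ** 2))
    print(f"Held-out one-step prediction MSE: {mse:.6f}")
```

In practice, a deep-learning approach such as the one studied in the paper would replace the linear predictor with a learned nonlinear model, but the input/output structure (predict upcoming CSI from a window of historical CSI) is the same.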