Proceedings of the 2018 Workshop on Network Meets AI & ML: Latest Publications

Tracking Groups in Mobile Network Traces
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229552
Kun Tu, Bruno Ribeiro, A. Swami, D. Towsley
Abstract: Detecting and tracking groups in mobility network traces is critical for developing accurate mobility models, which in turn are needed for mobile/wireless network design. One approach is to represent mobility traces as a temporal network and apply group (community) detection algorithms to it. However, observing detailed changes in a group over time requires analyzing group dynamics at small time scales and introduces two challenges: (a) group connectivity may be too sparse for group detection; and (b) tracking evolving groups and their lifetimes is difficult. We propose a group detection framework to address these time-scale challenges. For the time-dependent aspect of the groups, we propose a time series segmentation algorithm to detect their formations, dissolutions, and lifetimes. We generate synthetic datasets for mobile networks and use real-world datasets to test our method against the state of the art. The results show that our proposed approach achieves more accurate fine-grained group detection than competing methods.
Citations: 3
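The entry above combines snapshot-level group (community) detection with tracking of groups across time. Below is a minimal sketch of that general idea, assuming a networkx snapshot pipeline and Jaccard-overlap matching; it is not the paper's framework or its time series segmentation algorithm, and the threshold and helper names are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): detect communities in each
# snapshot of a temporal contact network and track them across snapshots
# by Jaccard overlap. Threshold and helper names are illustrative choices.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def snapshot_groups(edges_per_slot):
    """edges_per_slot: list of edge lists, one per time slot."""
    groups = []
    for edges in edges_per_slot:
        g = nx.Graph(edges)
        comms = [set(c) for c in greedy_modularity_communities(g)] if g.number_of_edges() else []
        groups.append(comms)
    return groups

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def track_groups(groups, threshold=0.5):
    """Link each group at time t to its best-overlapping group at time t+1."""
    links = []
    for t in range(len(groups) - 1):
        for i, ga in enumerate(groups[t]):
            best = max(range(len(groups[t + 1])),
                       key=lambda j: jaccard(ga, groups[t + 1][j]),
                       default=None)
            if best is not None and jaccard(ga, groups[t + 1][best]) >= threshold:
                links.append((t, i, t + 1, best))
    return links

slots = [[(1, 2), (2, 3), (1, 3), (4, 5)],
         [(1, 2), (2, 3), (4, 5), (5, 6)]]
print(track_groups(snapshot_groups(slots)))
```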
Deep-Q: Traffic-driven QoS Inference using Deep Generative Network
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229549
Shihan Xiao, Dongdong He, Zhibo Gong
Abstract: In today's IP networks, it is important to provide Quality of Service (QoS) guarantees for network services. However, in real networks with highly dynamic traffic demands, it is difficult to build an accurate QoS model, even at a high cost of human expert analysis. In this paper, we present Deep-Q, a data-driven system that learns the QoS model directly from traffic data without human analysis. It does so by leveraging the power of state-of-the-art deep generative networks. Deep-Q provides a novel inference structure: a variational auto-encoder (VAE) enhanced by long short-term memory (LSTM). A specially designed module named Cinfer-loss is further applied to improve QoS inference accuracy. By training on real traffic data, Deep-Q can infer a variety of QoS metrics over different networks given traffic conditions in real time. We build testbeds for both a data center network and an overlay IP network. Extensive experiments with 5.7TB of traffic traces demonstrate that Deep-Q achieves on average 3x higher inference accuracy than a traditional queuing-theory-based solution in real networks while keeping inference time within 100ms.
Citations: 46
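Deep-Q's inference structure couples a VAE encoder with an LSTM over a window of traffic features. The PyTorch sketch below shows what such an LSTM-encoder VAE can look like; the layer sizes, feature and QoS dimensions, and plain reconstruction-plus-KL loss are assumptions, and the paper's Cinfer-loss module is not reproduced.

```python
# Illustrative sketch only: an LSTM-based variational auto-encoder that maps a
# window of traffic features to QoS metrics. Layer sizes and the loss weighting
# are assumptions; the paper's Cinfer-loss term is not reproduced.
import torch
import torch.nn as nn

class LstmVaeQoS(nn.Module):
    def __init__(self, feat_dim=8, hidden=64, latent=16, qos_dim=3):
        super().__init__()
        self.encoder = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                     nn.Linear(hidden, qos_dim))

    def forward(self, x):                      # x: (batch, time, feat_dim)
        _, (h, _) = self.encoder(x)            # last hidden state summarizes the window
        h = h[-1]
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(pred, target, mu, logvar):
    recon = nn.functional.mse_loss(pred, target)
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kld

model = LstmVaeQoS()
x = torch.randn(32, 20, 8)   # 32 flows, 20 time steps, 8 traffic features (toy data)
y = torch.randn(32, 3)       # e.g. delay, jitter, loss targets (toy data)
pred, mu, logvar = model(x)
print(vae_loss(pred, y, mu, logvar).item())
```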
Adaptive Multiple Non-negative Matrix Factorization for Temporal Link Prediction in Dynamic Networks
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229546
Kai Lei, Meng Qin, B. Bai, Gong Zhang
Abstract: The prediction of mobility, topology, and traffic is an effective technique for improving the performance of various network systems, and it can generally be represented as the temporal link prediction problem. In this paper, we propose a novel adaptive multiple non-negative matrix factorization (AM-NMF) method, from the perspective of network embedding, to cope with this problem. Under the framework of non-negative matrix factorization (NMF), the proposed method embeds the dynamic network into a low-dimensional hidden space, where the characteristics of different network snapshots are comprehensively preserved. In particular, the new method can effectively incorporate the hidden information of different time slices, because we introduce a novel adaptive parameter to automatically adjust the relative contributions of different terms in the unified model. Accordingly, the prediction of the future network topology is generated by conducting the inverse process of NMF from the shared hidden space. Moreover, we derive the corresponding solving strategy, whose convergence can be ensured. We apply the new model to various network datasets such as human mobility networks, vehicle mobility networks, wireless mesh networks, and data center networks. Experimental results show that our method outperforms other state-of-the-art methods for temporal link prediction in both unweighted and weighted networks.
Citations: 17
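To make the NMF-based prediction idea concrete, here is a minimal sketch that collapses recent adjacency snapshots with decaying weights, factorizes the result, and uses the low-rank reconstruction as next-step link scores. It is not AM-NMF itself (no adaptive parameter or custom solver); the decay factor, rank, and scikit-learn usage are assumptions.

```python
# Illustrative sketch (not the paper's AM-NMF model): weighted collapse of
# snapshots + NMF reconstruction as link-prediction scores.
import numpy as np
from sklearn.decomposition import NMF

def predict_links(snapshots, rank=4, decay=0.5):
    """snapshots: list of (n x n) non-negative adjacency matrices, oldest first."""
    weights = np.array([decay ** (len(snapshots) - 1 - t) for t in range(len(snapshots))])
    collapsed = sum(w * a for w, a in zip(weights, snapshots)) / weights.sum()
    model = NMF(n_components=rank, init="nndsvda", max_iter=500)
    w = model.fit_transform(collapsed)    # n x rank embedding
    h = model.components_                 # rank x n basis
    return w @ h                          # score matrix for the next snapshot

rng = np.random.default_rng(0)
snaps = [(rng.random((10, 10)) < 0.2).astype(float) for _ in range(5)]
scores = predict_links(snaps)
print(np.round(scores[:3, :3], 2))
```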
IFS-RL: An Intelligent Forwarding Strategy Based on Reinforcement Learning in Named-Data Networking
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229547
Yi Zhang, B. Bai, Kuai Xu, Kai Lei
Abstract: Named-Data Networking (NDN) is a new communication paradigm in which network primitives are based on named data rather than host identifiers. Compared with IP, NDN has a unique feature: the forwarding plane enables each router to select the next forwarding hop independently, without relying on routing. Therefore, forwarding strategies play a significant role in adaptive and efficient data transmission in NDN. Most existing forwarding strategies use fixed control rules based on simplified or inaccurate models of the deployment environment. As a result, existing schemes inevitably fail to achieve optimal performance across a broad set of network conditions and application demands. In this paper, we propose IFS-RL, an intelligent forwarding strategy based on reinforcement learning. IFS-RL trains a neural network model that chooses appropriate interfaces for forwarding Interests based on observations collected by the routing node. Rather than relying on pre-programmed models, IFS-RL learns to make decisions solely from observations of the resulting performance of past decisions. Therefore, IFS-RL can implement intelligent forwarding that adapts to a wide range of network conditions. We also investigate the learning granularity and enhancements for network topology changes. We compare IFS-RL to state-of-the-art forwarding strategies in ndnSIM. Experimental results show that IFS-RL achieves higher throughput and lower packet drop rates.
Citations: 15
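The core decision IFS-RL learns is which interface to forward an Interest on, given observed performance. The sketch below is a deliberately simplified, tabular stand-in (not the paper's neural model, and not ndnSIM): a per-prefix epsilon-greedy learner rewarded by low delivery delay. The state, reward, and class names are assumptions.

```python
# Minimal sketch: an epsilon-greedy, per-prefix interface selector that learns
# from observed delivery delay. This is a toy stand-in for the paper's
# neural forwarding strategy.
import random
from collections import defaultdict

class InterfaceSelector:
    def __init__(self, n_faces, epsilon=0.1, alpha=0.3):
        # One value estimate per (prefix, face); each Interest is treated as a
        # one-step (bandit-like) decision for simplicity.
        self.q = defaultdict(lambda: [0.0] * n_faces)
        self.n_faces, self.epsilon, self.alpha = n_faces, epsilon, alpha

    def choose_face(self, prefix):
        if random.random() < self.epsilon:
            return random.randrange(self.n_faces)          # explore
        return max(range(self.n_faces), key=lambda f: self.q[prefix][f])

    def update(self, prefix, face, delay_ms, dropped=False):
        reward = -1000.0 if dropped else -delay_ms          # lower delay = higher reward
        self.q[prefix][face] += self.alpha * (reward - self.q[prefix][face])

agent = InterfaceSelector(n_faces=3)
for _ in range(200):
    face = agent.choose_face("/video/seg")
    delay = random.gauss([20, 50, 80][face], 5)             # toy per-face delays
    agent.update("/video/seg", face, delay)
print(agent.q["/video/seg"])                                # face 0 should score best
```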
Empowering Sketches with Machine Learning for Network Measurements
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229545
Tong Yang, Lun Wang, Yulong Shen, Muhammad Shahzad, Qun Huang, Xiaohong Jiang, Kun Tan, Xiaoming Li
Abstract: Network monitoring and management require accurate statistics for a variety of flow-level metrics, such as flow sizes, top-k flows, and the number of flows. Arguably, the most commonly used data structure to record and measure these metrics is the sketch. While a significant amount of work has already been done on sketching techniques, there is still considerable room for improvement, because the accuracy of existing sketches depends heavily on the nature of network traffic and varies significantly as traffic characteristics change. In this paper, we propose the idea of employing machine learning to reduce this dependence of sketch accuracy on network traffic characteristics, and present a generalized machine learning framework that increases the accuracy of sketches significantly. We further present three case studies in which we applied our framework to sketches for measuring three well-known flow-level network metrics. Experimental results show that machine learning helps decrease the error rates of existing sketches by up to 202 times.
Citations: 24
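For readers unfamiliar with the underlying data structure, here is a plain Count-Min sketch for flow-size estimation, the kind of sketch whose over-estimation errors an ML layer could learn to correct. The paper's ML framework itself is not reproduced; the hash construction and sizes below are illustrative choices.

```python
# Illustrative baseline: a plain Count-Min sketch for per-flow packet counts.
# The minimum over rows over-estimates in expectation; the paper's framework
# would learn to correct such errors (not shown here).
import hashlib

class CountMinSketch:
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, key, row):
        digest = hashlib.md5(f"{row}:{key}".encode()).digest()
        return int.from_bytes(digest[:8], "little") % self.width

    def add(self, key, count=1):
        for row in range(self.depth):
            self.table[row][self._index(key, row)] += count

    def estimate(self, key):
        return min(self.table[row][self._index(key, row)] for row in range(self.depth))

cms = CountMinSketch()
for pkt in ["10.0.0.1->10.0.0.2"] * 500 + ["10.0.0.3->10.0.0.4"] * 20:
    cms.add(pkt)
print(cms.estimate("10.0.0.1->10.0.0.2"), cms.estimate("10.0.0.3->10.0.0.4"))
```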
Improving TCP Congestion Control with Machine Intelligence
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229550
Yiming Kong, H. Zang, Xiaoli Ma
Abstract: In a TCP/IP network, a key to ensuring efficient and fair sharing of network resources among its users is the TCP congestion control (CC) scheme. Traditionally, TCP CC schemes have been designed by hard-wiring predefined actions to specific feedback signals from the network. However, as networks become more complex and dynamic, it becomes harder to design the optimal feedback-action mapping. Recently, learning-based TCP CC schemes have attracted much attention due to their strong ability to learn actions by interacting with the network. In this paper, we design two learning-based TCP CC schemes for wired networks with under-buffered bottleneck links: a loss-predictor-based TCP CC (LP-TCP) and a reinforcement-learning-based TCP CC (RL-TCP). We implement both LP-TCP and RL-TCP in NS2. Compared to the existing NewReno and Q-learning-based TCP, LP-TCP and RL-TCP both achieve a better tradeoff between throughput and delay under various simulated network scenarios.
Citations: 61
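To illustrate the reinforcement-learning side of such schemes, the sketch below shows a tabular Q-learning agent that adjusts the congestion window from a coarse RTT-based state. It is not the paper's LP-TCP/RL-TCP (which run inside NS2); the state discretization, action set, and reward shown here are assumptions.

```python
# Minimal sketch of an RL congestion-control loop: a tabular Q-learning agent
# that picks cwnd adjustments from a coarse queueing-delay state. All state,
# action, and reward choices here are illustrative.
import random
from collections import defaultdict

ACTIONS = [-10, 0, +10]        # change in cwnd (packets)

class RlCongestionControl:
    def __init__(self, alpha=0.2, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(lambda: [0.0] * len(ACTIONS))
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, state):
        if random.random() < self.epsilon:
            return random.randrange(len(ACTIONS))
        return max(range(len(ACTIONS)), key=lambda a: self.q[state][a])

    def learn(self, state, action, reward, next_state):
        target = reward + self.gamma * max(self.q[next_state])
        self.q[state][action] += self.alpha * (target - self.q[state][action])

def make_state(rtt_ms, min_rtt_ms):
    # Bucket queueing delay relative to the base RTT: 0 = low, 1 = medium, 2 = high.
    ratio = rtt_ms / min_rtt_ms
    return 0 if ratio < 1.2 else (1 if ratio < 2.0 else 2)

agent = RlCongestionControl()
s = make_state(rtt_ms=25, min_rtt_ms=20)
a = agent.act(s)
# In a real sender the reward would combine throughput and delay measured
# over the next RTT; a constant is used here just to exercise the update.
agent.learn(s, a, reward=1.0, next_state=make_state(30, 20))
print(dict(agent.q))
```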
HiPS: Hierarchical Parameter Synchronization in Large-Scale Distributed Machine Learning
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229544
Jinkun Geng, Dan Li, Yang Cheng, Shuai Wang, Junfeng Li
Abstract: In large-scale distributed machine learning (DML) systems, parameter (gradient) synchronization among machines plays an important role in improving DML performance. State-of-the-art DML synchronization algorithms, whether the parameter server (PS) based algorithm or the ring allreduce algorithm, work in a flat way and suffer when the network size is large. In this work, we propose HiPS, a hierarchical parameter (gradient) synchronization framework for large-scale DML. In HiPS, a server-centric network topology is used to better embrace RDMA/RoCE transport between machines, and the parameters (gradients) are synchronized in a hierarchical and hybrid way. Our evaluation on BCube and Torus networks demonstrates that HiPS better matches server-centric networks. Compared with the flat algorithms (PS-based and ring-based), HiPS reduces the synchronization time by 73% and 75%, respectively.
Citations: 25
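The essence of hierarchical synchronization is to aggregate within groups first and only then across groups. The numpy sketch below simulates that flow for gradient averaging; it is a schematic stand-in for HiPS, not its RDMA/RoCE implementation, and the group layout is an assumption.

```python
# Illustrative sketch of hierarchical gradient synchronization: intra-group
# reduce to a leader, all-reduce among leaders, then broadcast back. This only
# simulates the data flow; no network transport is involved.
import numpy as np

def hierarchical_allreduce(gradients, group_size=4):
    """gradients: list of equal-shape numpy arrays, one per worker."""
    groups = [gradients[i:i + group_size] for i in range(0, len(gradients), group_size)]
    leader_sums = [np.sum(g, axis=0) for g in groups]   # step 1: intra-group reduce
    global_sum = np.sum(leader_sums, axis=0)            # step 2: all-reduce among leaders
    average = global_sum / len(gradients)               # step 3: broadcast the average back
    return [average.copy() for _ in gradients]

grads = [np.random.randn(1000) for _ in range(8)]       # 8 workers, 2 groups of 4
synced = hierarchical_allreduce(grads)
assert np.allclose(synced[0], np.mean(grads, axis=0))
print(synced[0][:3])
```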
Assessing the Impact of Network Events with User Feedback
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229553
Shobha Venkataraman, Jia Wang
Abstract: User feedback data, generated when users call customer care agents with problems, is a valuable source of data for understanding network problems from users' perspectives. However, this data is extremely noisy. In this paper, we design a framework, LOTUS, to assess the user impact of network events from user feedback, through a novel algorithmic composition of co-training and spatial scan statistics. Through experimental analysis on synthetic and real data, we show the accuracy and practical nature of LOTUS.
Citations: 4
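One of the two ingredients LOTUS composes is the spatial scan statistic. The sketch below computes the textbook Kulldorff Poisson log-likelihood ratio over candidate regions to flag where complaint calls exceed expectation; it is not the paper's algorithm (the co-training part is omitted), and the region counts and baselines are toy assumptions.

```python
# Minimal sketch of a Poisson spatial scan statistic (textbook Kulldorff form),
# used here to flag a region whose feedback-call count exceeds its expectation.
import math

def poisson_llr(c, mu, total_c):
    """Log-likelihood ratio that a region with expectation mu and count c is a hotspot."""
    if c <= mu or mu == 0:
        return 0.0
    rest_c, rest_mu = total_c - c, total_c - mu
    return c * math.log(c / mu) + (rest_c * math.log(rest_c / rest_mu) if rest_c > 0 else 0.0)

def scan_regions(counts, baselines):
    """counts/baselines: per-region feedback calls and expected calls (e.g. user population)."""
    total_c, total_b = sum(counts), sum(baselines)
    scores = []
    for c, b in zip(counts, baselines):
        mu = total_c * b / total_b          # expected calls under the null hypothesis
        scores.append(poisson_llr(c, mu, total_c))
    return scores

counts = [12, 9, 55, 10]                    # complaint calls per region (toy data)
baselines = [100, 90, 110, 95]              # active users per region (toy data)
scores = scan_regions(counts, baselines)
print(max(range(len(scores)), key=lambda i: scores[i]))   # region 2 stands out
```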
Efficient Distribution-Derived Features for High-Speed Encrypted Flow Classification
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229548
Johan Garcia, Topi Korhonen
Abstract: Flow classification is an important tool for enabling efficient network resource usage, supporting traffic engineering, and aiding QoS mechanisms. As traffic is increasingly encrypted by default, flow classification is turning towards machine learning methods that employ features which remain available for encrypted traffic. In this work we evaluate flow features that capture the distributional properties of in-flow per-packet metrics such as packet size and inter-arrival time. The characteristics of such distributions are often captured with general statistical measures such as standard deviation, variance, etc. We instead propose a Kolmogorov-Smirnov discretization (KSD) algorithm that performs histogram bin construction based on the distributional properties observed in the data. This allows for a richer, histogram-based representation which also requires fewer resources for feature computation than higher-order statistical moments. A comprehensive evaluation using synthetic data from Gaussian and Beta mixtures shows that the KSD approach provides Jensen-Shannon distance results surpassing those of uniform binning and probabilistic binning. An empirical evaluation using live traffic traces from a cellular network further shows that, when coupled with a random forest classifier, the KSD-constructed features improve classification performance compared to general statistical features based on higher-order moments or alternative bin placement approaches.
Citations: 8
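The paper does not spell out the KSD algorithm in this abstract, so the sketch below is only one plausible reading of Kolmogorov-Smirnov-driven bin construction: edges are added greedily where the empirical CDF deviates most from the piecewise-linear CDF implied by the current edges. The bin count, greedy scheme, and toy packet-size mixture are all assumptions.

```python
# Hedged sketch of KS-driven bin construction (one plausible interpretation,
# not the paper's exact KSD algorithm): add a bin edge at the sample where the
# empirical CDF deviates most from the CDF implied by the current edges.
import numpy as np

def ksd_bins(samples, n_bins=8):
    x = np.sort(np.asarray(samples, dtype=float))
    ecdf = np.arange(1, len(x) + 1) / len(x)
    edges = [x[0], x[-1]]
    while len(edges) - 1 < n_bins:
        edges_arr = np.array(sorted(edges))
        # CDF value at each edge = fraction of samples at or below it.
        edge_cdf = np.searchsorted(x, edges_arr, side="right") / len(x)
        implied = np.interp(x, edges_arr, edge_cdf)       # piecewise-linear CDF
        worst = int(np.argmax(np.abs(ecdf - implied)))    # point of max KS deviation
        if x[worst] in edges:
            break
        edges.append(x[worst])
    return np.array(sorted(edges))

rng = np.random.default_rng(1)
packet_sizes = np.concatenate([rng.normal(80, 10, 500), rng.normal(1400, 60, 500)])
edges = ksd_bins(packet_sizes, n_bins=6)
hist, _ = np.histogram(packet_sizes, bins=edges)
print(np.round(edges, 1), hist)
```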
DeepCache: A Deep Learning Based Framework For Content Caching
Proceedings of the 2018 Workshop on Network Meets AI & ML | Pub Date: 2018-08-07 | DOI: 10.1145/3229543.3229555
Arvind Narayanan, Saurabh Verma, Eman Ramadan, Pariya Babaie, Zhi-Li Zhang
Abstract: In this paper, we present DeepCache, a novel framework for content caching that can significantly boost cache performance. Our framework is based on powerful deep recurrent neural network models. It comprises two main components: (i) an object characteristics predictor, which builds on a deep LSTM encoder-decoder model to predict the future characteristics of an object (such as object popularity) -- to the best of our knowledge, we are the first to propose an LSTM encoder-decoder model for content caching; and (ii) a caching policy component, which uses the predicted object information to make smart caching decisions. In thorough experiments, we show that applying the DeepCache framework to existing cache policies, such as LRU and k-LRU, significantly boosts the number of cache hits.
Citations: 115
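To illustrate the second component (the caching policy that consumes predicted popularity), the sketch below augments an LRU-style cache so that eviction prefers the object with the lowest predicted future popularity. The LSTM encoder-decoder predictor is not reproduced; the `predict_popularity` callback, tie-breaking rule, and toy popularity values are assumptions.

```python
# Illustrative sketch of a popularity-aware cache policy (not DeepCache itself):
# eviction picks the cached object with the lowest predicted future popularity,
# breaking ties toward the least recently used entry.
from collections import OrderedDict

class PopularityAwareCache:
    def __init__(self, capacity, predict_popularity):
        self.capacity = capacity
        self.predict = predict_popularity       # object_id -> predicted future requests
        self.store = OrderedDict()              # insertion/recency order, oldest first

    def get(self, obj):
        if obj in self.store:
            self.store.move_to_end(obj)         # refresh recency as plain LRU would
            return True                         # cache hit
        if len(self.store) >= self.capacity:
            victim = min(self.store,
                         key=lambda o: (self.predict(o), list(self.store).index(o)))
            del self.store[victim]
        self.store[obj] = True
        return False                            # cache miss

popularity = {"a": 0.9, "b": 0.5, "c": 0.1}     # stand-in for the LSTM predictor's output
cache = PopularityAwareCache(2, lambda o: popularity.get(o, 0.0))
hits = sum(cache.get(o) for o in ["a", "b", "a", "c", "a", "b"])
print(hits)
```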