Traffic Matrix Prediction with Attention-based Recurrent Neural Network

Maliang Zhang, Yingpeng Sang, Weizheng Li, Chaoxin Cai, Jinghao Huang
{"title":"Traffic Matrix Prediction with Attention-based Recurrent Neural Network","authors":"Maliang Zhang, Yingpeng Sang, Weizheng Li, Chaoxin Cai, Jinghao Huang","doi":"10.1145/3512576.3512594","DOIUrl":null,"url":null,"abstract":"Traffic matrix (TM) shows the traffic volume of a network. Therefore, TM prediction is of great significance for network management. Attention mechanism has been successful in many sub-domains of machine learning, such as computer vision and natural language processing, and it performs particularly well on time series data. In this work, we first introduce attention mechanisms into the traffic matrix prediction field by proposing an attention-based deep learning model for traffic matrix prediction. This model is composed of two parts, encoder and decoder. We use a recurrent neural network (RNN) architecture as the encoder and our decoder has an attention layer and a linear layer. Attention mechanism allows the model to have better memory ability, so the model can concentrate on those important data regardless of distance. We also reduce the time consumption of our model using GPU-based parallel acceleration. Finally, we evaluate the effectiveness of our model on a real world TM dataset, and the results show our implementations on the proposed model perform better than the baseline models.","PeriodicalId":278114,"journal":{"name":"Proceedings of the 2021 9th International Conference on Information Technology: IoT and Smart City","volume":"411 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2021 9th International Conference on Information Technology: IoT and Smart City","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3512576.3512594","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

A traffic matrix (TM) describes the traffic volume of a network, so TM prediction is of great significance for network management. The attention mechanism has been successful in many sub-domains of machine learning, such as computer vision and natural language processing, and it performs particularly well on time-series data. In this work, we introduce the attention mechanism into the field of traffic matrix prediction for the first time by proposing an attention-based deep learning model. The model is composed of two parts: an encoder and a decoder. We use a recurrent neural network (RNN) as the encoder, and the decoder consists of an attention layer and a linear layer. The attention mechanism gives the model better memory, letting it concentrate on important data regardless of their distance in the sequence. We also reduce the time consumption of the model with GPU-based parallel acceleration. Finally, we evaluate the effectiveness of the model on a real-world TM dataset; the results show that the proposed model outperforms the baseline models.
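To make the described architecture concrete, below is a minimal sketch of an encoder-decoder of this shape in PyTorch. The abstract does not specify the RNN variant, the attention scoring function, or any hyperparameters, so the GRU encoder, dot-product attention, window length, and layer sizes here are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an attention-based encoder-decoder for TM prediction.
# Assumptions (not from the paper): GRU encoder, dot-product attention,
# flattened n x n traffic matrices as time steps, arbitrary sizes.
import torch
import torch.nn as nn


class AttentionTMPredictor(nn.Module):
    """An RNN encodes a window of past traffic matrices; the decoder
    attends over all encoder states and maps the attention context
    through a linear layer to the next traffic matrix."""

    def __init__(self, n_flows: int, hidden_size: int = 128):
        super().__init__()
        # Each time step is one flattened n x n traffic matrix (n_flows = n*n).
        self.encoder = nn.GRU(n_flows, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_flows)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_flows) -- past TM snapshots.
        enc_states, h_last = self.encoder(x)      # (batch, window, hidden)
        query = h_last[-1].unsqueeze(1)           # (batch, 1, hidden)
        # Dot-product attention: the final hidden state queries every
        # encoder state, so relevant steps are weighted regardless of
        # how far back in the window they occurred.
        scores = torch.bmm(query, enc_states.transpose(1, 2))   # (batch, 1, window)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_states).squeeze(1)     # (batch, hidden)
        return self.out(context)                  # predicted next TM


# Hypothetical usage: predict the next 12x12 traffic matrix from the
# previous 10 snapshots (shapes chosen purely for illustration).
model = AttentionTMPredictor(n_flows=12 * 12)
history = torch.randn(32, 10, 12 * 12)
next_tm = model(history)                          # (32, 144)
```

In a setup like this, the inputs would be flattened origin-destination snapshots from the TM dataset, and the model would be trained with a regression loss such as MSE against the next snapshot; the batched tensor operations also parallelize naturally on a GPU.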