IMPROVING TRAFFIC DENSITY PREDICTION USING LSTM WITH PARAMETRIC ReLU (PReLU) ACTIVATION

Nur Alamsyah, Titan Parama Yoga, Budiman Budiman
JITK (Jurnal Ilmu Pengetahuan dan Teknologi Komputer), published 2024-02-01. DOI: 10.33480/jitk.v9i2.5046 (https://doi.org/10.33480/jitk.v9i2.5046). Citations: 0.

Abstract

Faced with complex traffic flow patterns, this research applies the Long Short-Term Memory (LSTM) model and compares four activation functions: tanh, ReLU, sigmoid, and PReLU. The aim is to improve the accuracy of traffic flow prediction by identifying the best activation function among the four. The method begins with the collection of traffic flow datasets covering the period 2015-2017, which are used to train and evaluate the LSTM model under each activation function. Performance was assessed by observing the training Mean Squared Error (MSE) and validation MSE. The experimental results show that PReLU gives the best results, with a training MSE of 0.000505 and a validation MSE of 0.000755. Although tanh, ReLU, and sigmoid produced competitive results, PReLU stood out as the optimal choice for improving the model's adaptability to complex traffic flow patterns.
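For readers unfamiliar with the activation function being compared: PReLU generalizes ReLU by replacing the zero slope on negative inputs with a learnable slope α, i.e. f(x) = x for x > 0 and f(x) = αx otherwise, which lets the unit pass a small gradient for negative pre-activations instead of going fully dead. A minimal NumPy sketch of the function (not code from the paper; the fixed `alpha=0.25` here stands in for what would be a trained per-channel parameter in the LSTM model):

```python
import numpy as np

def prelu(x, alpha=0.25):
    """Parametric ReLU: identity for positive inputs, slope `alpha` for negatives.

    In a trained network `alpha` is a learnable parameter; it is fixed here
    only for illustration.
    """
    return np.where(x > 0, x, alpha * x)

# Unlike plain ReLU, negative inputs are scaled rather than zeroed out.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x))  # [-0.5   -0.125  0.     1.     3.   ]
```

Because α is learned jointly with the network weights, PReLU can adapt its negative-side response to the data, which is consistent with the paper's finding that it handles complex traffic flow patterns better than the fixed-shape tanh, ReLU, and sigmoid functions.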