{"title":"IMPROVING TRAFFIC DENSITY PREDICTION USING LSTM WITH PARAMETRIC ReLU (PReLU) ACTIVATION","authors":"Nur Alamsyah, Titan Parama Yoga, Budiman Budiman","doi":"10.33480/jitk.v9i2.5046","DOIUrl":null,"url":null,"abstract":"In the presence of complex traffic flow patterns, this research responds to the challenge by proposing the application of the Long Short-Term Memory (LSTM) model and comparing four different activation functions, namely tanh, ReLU, sigmoid, and PReLU. This research aims to improve the accuracy of traffic flow prediction through LSTM model by finding the best activation function among tanh, relu, sigmoid, and PReLU. The method used starts from the collection of traffic flow datasets covering the period 2015-2017 used to train and evaluate the LSTM model with the four activation functions. Tests were conducted by observing the Train Mean Squared Error (MSE) and Validation MSE. The experimental results show that PReLU provides the best results with a Train MSE of 0.000505 and Validation MSE of 0.000755. Although tanh, ReLU, and sigmoid provided competitive results, PReLU stood out as the optimal choice to improve the adaptability of the model to complex traffic flow patterns.","PeriodicalId":475197,"journal":{"name":"JITK (Jurnal Ilmu Pengetahuan dan Teknologi Komputer)","volume":"63 20","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JITK (Jurnal Ilmu Pengetahuan dan Teknologi Komputer)","FirstCategoryId":"0","ListUrlMain":"https://doi.org/10.33480/jitk.v9i2.5046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Faced with complex traffic flow patterns, this research applies a Long Short-Term Memory (LSTM) model and compares four activation functions: tanh, ReLU, sigmoid, and PReLU. The goal is to improve the accuracy of traffic flow prediction by identifying the best of these four activation functions for the LSTM model. Traffic flow data covering the period 2015-2017 were collected and used to train and evaluate the LSTM model with each activation function, and performance was measured using the Train Mean Squared Error (MSE) and Validation MSE. The experimental results show that PReLU gives the best performance, with a Train MSE of 0.000505 and a Validation MSE of 0.000755. Although tanh, ReLU, and sigmoid produced competitive results, PReLU stood out as the choice that best improves the model's adaptability to complex traffic flow patterns.
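The abstract does not give implementation details, but the four activation functions it compares, along with the MSE metric used for evaluation, can be sketched in plain NumPy. This is a minimal illustration, not the authors' code; the PReLU slope `alpha=0.25` is an assumed initial value (in a trained model, alpha is a learned parameter), and the sample input `x` is arbitrary.

```python
import numpy as np

def tanh(x):
    # Hyperbolic tangent: squashes inputs to (-1, 1).
    return np.tanh(x)

def relu(x):
    # ReLU: clips all negative inputs to zero.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: squashes inputs to (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def prelu(x, alpha=0.25):
    # PReLU: identity for positive inputs, a slope alpha for
    # negative inputs. Unlike ReLU, negative values are scaled
    # rather than discarded; in training, alpha is learned.
    return np.where(x > 0, x, alpha * x)

def mse(y_true, y_pred):
    # Mean Squared Error, the metric reported in the paper.
    return float(np.mean((y_true - y_pred) ** 2))

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))   # negative inputs become 0
print(prelu(x))  # negative inputs scaled by alpha, not clipped
```

Keeping a nonzero gradient for negative inputs is what the abstract credits for PReLU's adaptability: where ReLU would zero out part of the signal, PReLU lets the model learn how much of it to retain.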