{"title":"Training RNN and it’s Variants Using Sliding Window Technique","authors":"Prerit Khandelwal, Jinia Konar, Banalaxmi Brahma","doi":"10.1109/SCEECS48394.2020.93","DOIUrl":null,"url":null,"abstract":"Recurrent neural networks are a type of neural network which was developed for handling sequential data more efficiently. Unlike feedforward neural networks, RNNs can use their internal state to process input sequences. A recurrent network can be implemented in many ways like Long Short Term Memory cell (LSTM), Gated Recurrent Unit (GRU), multidimensional LSTM, bidirectional LSTM, etc. In this paper, we have implemented variants of RNN. We have trained the models using conventional training technique as well as using a sliding window training technique. Later on in this paper, we have compared these techniques based on their performances and concluded which technique and model produce the best result for different datasets.","PeriodicalId":167175,"journal":{"name":"2020 IEEE International Students' Conference on Electrical,Electronics and Computer Science (SCEECS)","volume":"118 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Students' Conference on Electrical,Electronics and Computer Science (SCEECS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SCEECS48394.2020.93","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Recurrent neural networks (RNNs) are a class of neural networks developed to handle sequential data more efficiently. Unlike feedforward neural networks, RNNs can use their internal state to process input sequences. A recurrent network can be implemented in many ways, such as the Long Short-Term Memory (LSTM) cell, the Gated Recurrent Unit (GRU), multidimensional LSTM, and bidirectional LSTM. In this paper, we implement several variants of the RNN and train the models using both a conventional training technique and a sliding window training technique. We then compare these techniques based on their performance and conclude which technique and model produce the best results for different datasets.
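The abstract does not detail how the sliding window technique is applied, but the standard approach is to slice a sequence into overlapping fixed-length input windows, each paired with the next value as the training target. A minimal sketch (the function name `make_windows` and the window size are illustrative assumptions, not from the paper):

```python
import numpy as np

def make_windows(series, window_size):
    """Slide a fixed-length window over a 1-D series, producing
    (input window, next value) training pairs for an RNN."""
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])   # window of past values
        y.append(series[i + window_size])     # value to predict
    return np.array(X), np.array(y)

# Toy example: a series of 10 points with a window of size 3
series = np.arange(10, dtype=float)
X, y = make_windows(series, 3)
print(X.shape, y.shape)  # (7, 3) (7,)
```

Each row of `X` would then be fed to the recurrent model as one input sequence, so the network sees many short overlapping subsequences instead of the full series at once.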