{"title":"Rectified Attention Gate Unit in Recurrent Neural Networks for Effective Attention Computation","authors":"Manh-Hung Ha, O. Chen","doi":"10.1109/SSP53291.2023.10207931","DOIUrl":null,"url":null,"abstract":"Recurrent Neural Networks (RNNs) have been successful in figuring out applications on time series data. Particularly, effectively capturing local features can ameliorate the performance of RNN. Accordingly, we propose a Rectified Attention Gate Unit (RAGU) which amends Gated Recurrent Unit (GRU) with two special attention mechanisms for RNNs. These two attention mechanisms are a Convolutional Attention (ConvAtt) module performing the convolutional operations on the current input and the previous hidden state to fairly establish the spatiotemporal relationship, and an Attention Module (AM) taking outputs from ConvAtt to fulfill the integrated attention computations for discovering the contextual dependency. Experimental results reveal that RNN using the proposed RAGUs has superior accuracies than RNNs using the other cell units on the HMDB51 and MNIST datasets. Therefore, RAGU proposed herein is an effective model which can bring out outstanding performance for various time series applications.","PeriodicalId":296346,"journal":{"name":"2023 IEEE Statistical Signal Processing Workshop (SSP)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Statistical Signal Processing Workshop (SSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSP53291.2023.10207931","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Recurrent Neural Networks (RNNs) have been successful in applications involving time-series data. In particular, effectively capturing local features can improve RNN performance. Accordingly, we propose a Rectified Attention Gate Unit (RAGU), which extends the Gated Recurrent Unit (GRU) with two dedicated attention mechanisms for RNNs. The first is a Convolutional Attention (ConvAtt) module that performs convolutional operations on the current input and the previous hidden state to establish their spatiotemporal relationship; the second is an Attention Module (AM) that takes the ConvAtt outputs and performs integrated attention computations to discover contextual dependencies. Experimental results reveal that an RNN using the proposed RAGUs achieves higher accuracy than RNNs using other cell units on the HMDB51 and MNIST datasets. Therefore, the proposed RAGU is an effective model that can deliver strong performance across a variety of time-series applications.
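
To make the cell structure described in the abstract concrete, below is a minimal, speculative PyTorch sketch of a RAGU-style recurrent cell. The abstract only states that ConvAtt convolves the current input and previous hidden state and that AM combines the ConvAtt outputs into attention that feeds the gating computation; the kernel size, layer dimensions, the ReLU "rectification", and the exact point where attention modulates the GRU-style gates are all assumptions for illustration, not the authors' published design.

```python
# Speculative sketch of a RAGU-like cell: a GRU-style update whose reset path
# is modulated by attention computed from convolved input/hidden features.
import torch
import torch.nn as nn


class RAGUCellSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # ConvAtt (assumed form): 1-D convolutions over the feature dimension
        # of the current input x_t and the previous hidden state h_{t-1}.
        self.conv_x = nn.Conv1d(1, 1, kernel_size, padding=pad)
        self.conv_h = nn.Conv1d(1, 1, kernel_size, padding=pad)
        # AM (assumed form): fuse the convolved features into attention weights.
        self.attn = nn.Linear(input_size + hidden_size, hidden_size)
        # Standard GRU-style update/reset gates and candidate state.
        self.gates = nn.Linear(input_size + hidden_size, 2 * hidden_size)
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        # ConvAtt: treat each feature vector as a 1-channel signal.
        cx = self.conv_x(x_t.unsqueeze(1)).squeeze(1)        # (B, input_size)
        ch = self.conv_h(h_prev.unsqueeze(1)).squeeze(1)     # (B, hidden_size)
        # AM: rectified (ReLU, assumed) attention over the ConvAtt outputs.
        a_t = torch.relu(self.attn(torch.cat([cx, ch], dim=-1)))  # (B, hidden_size)

        zr = self.gates(torch.cat([x_t, h_prev], dim=-1))
        z_t, r_t = torch.sigmoid(zr).chunk(2, dim=-1)        # update / reset gates
        # Assumption: attention scales the reset path before the candidate state.
        h_tilde = torch.tanh(
            self.candidate(torch.cat([x_t, r_t * a_t * h_prev], dim=-1))
        )
        return (1.0 - z_t) * h_prev + z_t * h_tilde


if __name__ == "__main__":
    cell = RAGUCellSketch(input_size=32, hidden_size=64)
    x = torch.randn(8, 32)          # batch of 8 input vectors
    h = torch.zeros(8, 64)          # initial hidden state
    print(cell(x, h).shape)         # torch.Size([8, 64])
```

The sketch keeps the GRU interpolation h_t = (1 - z_t) * h_{t-1} + z_t * h~_t intact and only inserts the attention term into the candidate computation, which is one plausible reading of "amending the GRU with attention mechanisms"; the paper itself should be consulted for the actual gating equations.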