{"title":"Attention-Based Bidirectional Long Short Term Memory Networks Combine with Phrase Convolution Layer for Relation Extraction","authors":"Chuangmin Xie, Degang Chen, Hao Shi, Mingyu Fan","doi":"10.1109/SLAAI-ICAI54477.2021.9664707","DOIUrl":null,"url":null,"abstract":"Relation Extraction (RE) is one of the most important tasks in Natural Language Processing (NLP). In recent years, with the development of deep learning, a variety of deep neural networks, such as Convolution Neural Network (CNN), Recurrent Neural Network (RNN) and Long Short Term Memory Network (LSTM), have been used in relation extraction and made significant progress. Moreover, LSTM has become the mainstream model in the field of NLP due to its better long term dependencies capture capability than CNN. However, the ability of LSTM to capture long term dependencies is still limited. In order to solve this problem, we propose a phrase convolution structure. The structure can extract the phrase-level features of the sentence, and the sentence-level features can be further extracted after the features are input into LSTM. We believe that this actually enhances the ability of LSTM to capture long term dependencies. Our experiments on SemEva1-2010 Task 8 dataset show that the performance of our model is better than most existing models.","PeriodicalId":252006,"journal":{"name":"2021 5th SLAAI International Conference on Artificial Intelligence (SLAAI-ICAI)","volume":"129 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 5th SLAAI International Conference on Artificial Intelligence (SLAAI-ICAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SLAAI-ICAI54477.2021.9664707","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Relation Extraction (RE) is one of the most important tasks in Natural Language Processing (NLP). In recent years, with the development of deep learning, a variety of deep neural networks, such as the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and the Long Short-Term Memory network (LSTM), have been applied to relation extraction and have made significant progress. LSTM in particular has become a mainstream model in NLP because it captures long-term dependencies better than CNN. However, the ability of LSTM to capture long-term dependencies is still limited. To address this problem, we propose a phrase convolution structure. This structure extracts phrase-level features of a sentence, and sentence-level features can then be extracted by feeding these phrase-level features into the LSTM. We believe this enhances the ability of LSTM to capture long-term dependencies. Our experiments on the SemEval-2010 Task 8 dataset show that our model outperforms most existing models.
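To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' implementation, of a phrase convolution layer feeding a bidirectional LSTM with attention pooling. The layer sizes, kernel width, and attention formulation are all assumptions; the output size of 19 matches SemEval-2010 Task 8's relation label set.

```python
# A minimal sketch (not the authors' released code) of the architecture the
# abstract describes: a phrase-level convolution over word embeddings whose
# outputs feed a bidirectional LSTM, followed by simple attention pooling.
# Layer sizes, kernel width, and the attention form are assumptions.
import torch
import torch.nn as nn


class PhraseConvBiLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, conv_dim=100,
                 hidden_dim=100, num_classes=19, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Phrase convolution: a 1-D convolution over windows of adjacent word
        # embeddings, producing one phrase-level feature vector per position.
        self.phrase_conv = nn.Conv1d(embed_dim, conv_dim, kernel_size,
                                     padding=kernel_size // 2)
        # BiLSTM consumes phrase-level features, yielding sentence-level ones.
        self.bilstm = nn.LSTM(conv_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Attention pooling over time steps (one common formulation, assumed).
        self.att_weight = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq)
        x = self.embed(token_ids)                      # (batch, seq, embed)
        x = x.transpose(1, 2)                          # (batch, embed, seq)
        phrases = torch.tanh(self.phrase_conv(x))      # (batch, conv, seq)
        phrases = phrases.transpose(1, 2)              # (batch, seq, conv)
        h, _ = self.bilstm(phrases)                    # (batch, seq, 2*hidden)
        scores = torch.softmax(self.att_weight(h), 1)  # (batch, seq, 1)
        sentence = (scores * h).sum(dim=1)             # (batch, 2*hidden)
        return self.classifier(sentence)               # (batch, num_classes)


# Usage with dummy token ids; SemEval-2010 Task 8 has 19 relation labels.
model = PhraseConvBiLSTM()
logits = model(torch.randint(0, 10000, (2, 30)))
print(logits.shape)  # torch.Size([2, 19])
```

The key design point the abstract argues for is visible in the forward pass: the LSTM never sees raw word embeddings, only convolution outputs that already summarize local phrases, so each LSTM step spans a wider context window.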