Distant Supervised Relation Extraction with Hierarchical Attention Mechanism
Jianying Liu, Liandong Chen, Rui Shi, J. Xu, AN Liu
Proceedings of the 7th International Conference on Communication and Information Processing, December 16, 2021
DOI: 10.1145/3507971.3507980
Citations: 0
Abstract
Current distantly supervised relation extraction algorithms based on neural networks mostly rely on long short-term memory (LSTM) networks and convolutional neural networks (CNNs), which struggle to capture long-distance features of sentences. This paper proposes a distantly supervised relation extraction model based on a hierarchical attention mechanism: a self-attention mechanism computes features between words, and a sentence-level soft-attention mechanism aggregates sentence features. Compared with previous methods, the proposed model better captures sentence features and improves the quality of sentence relation classification. On the NYT-10 dataset, compared with the PCNN_ATT algorithm, the P@100, P@200, and P@300 metrics increase by 4.8%, 4.9%, and 2.3%, respectively, and AUC increases by 1.1%.
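The two attention levels described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the embedding dimension, pooling choice, and the relation-query vector are all illustrative assumptions, and the word level uses plain scaled dot-product self-attention while the sentence level uses soft attention over a bag of sentences sharing the same entity pair.

```python
# Hedged sketch: word-level self-attention + sentence-level soft attention.
# All dimensions and the relation query are hypothetical, not from the paper.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def word_self_attention(tokens):
    """Word-level self-attention: every token attends to every other token,
    so long-distance word pairs interact directly (unlike an LSTM/CNN).
    tokens: (seq_len, d) word-embedding matrix.
    Returns a (d,) sentence vector (mean pool over attended tokens)."""
    d = tokens.shape[-1]
    scores = tokens @ tokens.T / np.sqrt(d)        # (seq_len, seq_len)
    attended = softmax(scores, axis=-1) @ tokens   # (seq_len, d)
    return attended.mean(axis=0)

def sentence_soft_attention(sent_vecs, relation_query):
    """Sentence-level soft attention over a bag of sentences that mention
    the same entity pair; noisy sentences receive low weight.
    sent_vecs: (num_sents, d); relation_query: (d,) hypothetical query."""
    weights = softmax(sent_vecs @ relation_query)  # (num_sents,)
    return weights @ sent_vecs                     # (d,) bag representation

rng = np.random.default_rng(0)
d = 8
bag = [rng.normal(size=(n, d)) for n in (5, 7, 4)]  # 3 sentences, toy data
sent_vecs = np.stack([word_self_attention(s) for s in bag])
query = rng.normal(size=d)                          # hypothetical relation query
bag_vec = sentence_soft_attention(sent_vecs, query)
print(bag_vec.shape)
```

The bag vector would then be fed to a relation classifier; in a trained model the relation query and embeddings are learned parameters rather than random draws.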