Author: Siheng Wei
DOI: 10.1109/CDS52072.2021.00022
Venue: 2021 2nd International Conference on Computing and Data Science (CDS)
Publication date: 2021-01-01
Citation count: 1
Distant Supervision for Relation Extraction via LayerNorm Gated Recurrent Neural Networks
Relation extraction is a classic NLP task that aims to predict the relation between two entities in a given sentence. The convolutional neural network (CNN) is one of the typical neural architectures applied to this task. However, existing CNN models used for extraction cannot capture the sequential information in sentences, which contributes greatly to predicting the correct directionality of the relation between the two entities. Therefore, I propose a new gated recurrent neural network with layer normalization (LNGRU) to capture contextual information from both the past and the future in a sentence. Experiments demonstrate that my model is effective and outperforms several comparable baseline models.
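The paper's exact formulation of the LNGRU is not reproduced here; as a minimal sketch, assuming layer normalization is applied to each gate's pre-activation (one common placement), a layer-normalized GRU cell might look like the following. All names (`LNGRUCell`, `layer_norm`, weight initialization) are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize a vector to zero mean and unit variance.
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LNGRUCell:
    """GRU cell with layer normalization on the gate pre-activations (sketch)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # One input and one recurrent weight matrix per gate:
        # update (z), reset (r), candidate (h).
        self.W = {g: rng.uniform(-s, s, (hidden_size, input_size)) for g in "zrh"}
        self.U = {g: rng.uniform(-s, s, (hidden_size, hidden_size)) for g in "zrh"}
        self.hidden_size = hidden_size

    def step(self, x, h_prev):
        # Layer-normalize each gate's pre-activation before the nonlinearity.
        z = sigmoid(layer_norm(self.W["z"] @ x + self.U["z"] @ h_prev))
        r = sigmoid(layer_norm(self.W["r"] @ x + self.U["r"] @ h_prev))
        h_tilde = np.tanh(layer_norm(self.W["h"] @ x + self.U["h"] @ (r * h_prev)))
        # Convex combination of the previous state and the candidate state.
        return (1.0 - z) * h_prev + z * h_tilde

    def forward(self, xs):
        # Run the cell over a sequence of input vectors; return all hidden states.
        h = np.zeros(self.hidden_size)
        hs = []
        for x in xs:
            h = self.step(x, h)
            hs.append(h)
        return np.stack(hs)
```

To capture context from both the past and the future, as the abstract describes, one would typically run a second cell over the reversed sequence and concatenate the two hidden states at each position (a standard bidirectional setup).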