{"title":"基于改进关注机制的联合中文实体关系提取","authors":"Hu Dingding","doi":"10.1109/ISAIEE57420.2022.00060","DOIUrl":null,"url":null,"abstract":"Entity relation extraction is one of the core sub-tasks of information extraction, and also the focus of natural language processing research.First, according to the problems of pipelined entity relation extraction, a joint method based on sequence annotation is adopted for entity relation extraction. Secondly, the knowledge-enhanced ERNIE pretraining model is used for text semantic representation. In the feature extraction module, a general attention mechanism is not effective in the small-scale data set. An improved attention mechanism and BiLSTM are proposed. Finally, a variant-loss function of circle loss is adopted for the slow model convergence problem caused by the data label imbalance problem. After experiment, it is shown that the proposed fusion model outperforms the other models, while using the variant loss function of circle loss makes the model converge faster.","PeriodicalId":345703,"journal":{"name":"2022 International Symposium on Advances in Informatics, Electronics and Education (ISAIEE)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Joint Chinese entity relationship extraction based on the improved attention mechanism\",\"authors\":\"Hu Dingding\",\"doi\":\"10.1109/ISAIEE57420.2022.00060\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Entity relation extraction is one of the core sub-tasks of information extraction, and also the focus of natural language processing research.First, according to the problems of pipelined entity relation extraction, a joint method based on sequence annotation is adopted for entity relation extraction. Secondly, the knowledge-enhanced ERNIE pretraining model is used for text semantic representation. In the feature extraction module, a general attention mechanism is not effective in the small-scale data set. An improved attention mechanism and BiLSTM are proposed. Finally, a variant-loss function of circle loss is adopted for the slow model convergence problem caused by the data label imbalance problem. 
After experiment, it is shown that the proposed fusion model outperforms the other models, while using the variant loss function of circle loss makes the model converge faster.\",\"PeriodicalId\":345703,\"journal\":{\"name\":\"2022 International Symposium on Advances in Informatics, Electronics and Education (ISAIEE)\",\"volume\":\"13 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Symposium on Advances in Informatics, Electronics and Education (ISAIEE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISAIEE57420.2022.00060\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Symposium on Advances in Informatics, Electronics and Education (ISAIEE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISAIEE57420.2022.00060","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Joint Chinese entity relationship extraction based on the improved attention mechanism
Entity relation extraction is one of the core sub-tasks of information extraction and a long-standing focus of natural language processing research. First, to address the shortcomings of pipelined entity relation extraction, a joint extraction method based on sequence labeling is adopted. Second, the knowledge-enhanced ERNIE pretraining model is used for text semantic representation. In the feature extraction module, because a generic attention mechanism is not effective on small-scale datasets, an improved attention mechanism combined with BiLSTM is proposed. Finally, a variant of the circle loss function is adopted to address the slow model convergence caused by label imbalance in the data. Experiments show that the proposed fusion model outperforms the compared models, and that the circle-loss variant makes the model converge faster.
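The abstract only names the components of the pipeline, so the following is a minimal PyTorch sketch of a model of that shape, assuming contextual token embeddings from an ERNIE encoder are computed upstream. The additive attention layer, the multi-hot per-token tag scheme, the dimensions, and the circle-loss-style multi-label loss (a formulation commonly used for imbalanced multi-label tagging) are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch: ERNIE embeddings -> BiLSTM -> attention -> per-token tag logits,
# trained with a circle-loss-style multi-label loss. Shapes and names are illustrative.
import torch
import torch.nn as nn


class BiLSTMAttentionTagger(nn.Module):
    """BiLSTM plus additive attention over contextual embeddings, emitting
    one tag-score vector per token for joint entity-relation tagging."""

    def __init__(self, embed_dim: int, hidden_dim: int, num_tags: int):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Additive (Bahdanau-style) attention as a stand-in for the paper's
        # "improved attention mechanism", whose exact form is not given here.
        self.attn_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(4 * hidden_dim, num_tags)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, embed_dim), e.g. ERNIE hidden states
        h, _ = self.bilstm(embeddings)                             # (B, T, 2H)
        scores = self.attn_score(torch.tanh(self.attn_proj(h)))   # (B, T, 1)
        weights = torch.softmax(scores, dim=1)                     # attention over tokens
        context = (weights * h).sum(dim=1, keepdim=True)           # (B, 1, 2H)
        context = context.expand(-1, h.size(1), -1)                # broadcast to every token
        return self.classifier(torch.cat([h, context], dim=-1))    # (B, T, num_tags)


def circle_style_multilabel_loss(logits: torch.Tensor,
                                 targets: torch.Tensor) -> torch.Tensor:
    """One common circle-loss-derived multi-label loss (an assumption about the
    "variant" used):  log(1 + sum_neg e^{s}) + log(1 + sum_pos e^{-s}).
    logits, targets: (..., num_tags), targets are multi-hot in {0, 1}."""
    scores = (1 - 2 * targets) * logits            # negatives keep +s, positives become -s
    neg = scores - targets * 1e12                   # mask out positive classes
    pos = scores - (1 - targets) * 1e12             # mask out negative classes
    zeros = torch.zeros_like(scores[..., :1])       # implicit threshold term
    loss_neg = torch.logsumexp(torch.cat([neg, zeros], dim=-1), dim=-1)
    loss_pos = torch.logsumexp(torch.cat([pos, zeros], dim=-1), dim=-1)
    return (loss_neg + loss_pos).mean()


if __name__ == "__main__":
    model = BiLSTMAttentionTagger(embed_dim=768, hidden_dim=256, num_tags=24)
    x = torch.randn(2, 32, 768)                     # stand-in for ERNIE embeddings
    logits = model(x)                               # (2, 32, 24)
    targets = torch.zeros_like(logits).bernoulli_(0.05)
    print(logits.shape, circle_style_multilabel_loss(logits, targets).item())
```

In an actual implementation, the embeddings would typically come from a Chinese ERNIE checkpoint loaded through a pretraining library, and the tag inventory would encode both entity boundaries and relation types under the joint sequence-labeling scheme described in the abstract.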