{"title":"基于关注机制的多窗口核cnn关系分类","authors":"Xiao Huang, J. Lin, Wei Teng, Yanxiang Bao","doi":"10.1109/IAEAC47372.2019.8997966","DOIUrl":null,"url":null,"abstract":"Relation classification is an important ingredient task in the construction of knowledge graph, question answering system and numerous other natural language processing (NLP) tasks. With the application of deep neural networks (DNNs) such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), relation classification task has achieved satisfactory results. However, many proposed models can not take well advantages of multiple window sizes for filters in CNNs and finally hurt the performance of this task. Moreover, unlike public and general dataset that has a large quantity of instances from natural languages or daily conversations, the performances of many deep neural networks with high complexity are not well enough for a small corpus in specific fields. To work out these problems, we propose a novel CNN model with attention mechanism for multi-window-sized kernels to capture the most important information and test our system not only on a general dataset of SemEval 2010 but also on a small dataset built from Chinese fundamentals of electric circuits textbook artificially. The experimental results show that our system outperforms the baseline systems for the SemEval 2010 relation classification task and validate the effectiveness of CNN on the specific Chinese small corpus relation classification task.","PeriodicalId":164163,"journal":{"name":"2019 IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Relation Classification via CNNs with Attention Mechanism for Multi-Window-Sized Kernels\",\"authors\":\"Xiao Huang, J. Lin, Wei Teng, Yanxiang Bao\",\"doi\":\"10.1109/IAEAC47372.2019.8997966\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Relation classification is an important ingredient task in the construction of knowledge graph, question answering system and numerous other natural language processing (NLP) tasks. With the application of deep neural networks (DNNs) such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), relation classification task has achieved satisfactory results. However, many proposed models can not take well advantages of multiple window sizes for filters in CNNs and finally hurt the performance of this task. Moreover, unlike public and general dataset that has a large quantity of instances from natural languages or daily conversations, the performances of many deep neural networks with high complexity are not well enough for a small corpus in specific fields. To work out these problems, we propose a novel CNN model with attention mechanism for multi-window-sized kernels to capture the most important information and test our system not only on a general dataset of SemEval 2010 but also on a small dataset built from Chinese fundamentals of electric circuits textbook artificially. 
The experimental results show that our system outperforms the baseline systems for the SemEval 2010 relation classification task and validate the effectiveness of CNN on the specific Chinese small corpus relation classification task.\",\"PeriodicalId\":164163,\"journal\":{\"name\":\"2019 IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IAEAC47372.2019.8997966\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IAEAC47372.2019.8997966","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Relation classification is a key subtask in the construction of knowledge graphs, question answering systems, and many other natural language processing (NLP) tasks. With the application of deep neural networks (DNNs) such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), relation classification has achieved satisfactory results. However, many proposed models do not take full advantage of multiple filter window sizes in CNNs, which ultimately hurts performance on this task. Moreover, unlike public, general-purpose datasets that contain a large number of instances drawn from natural language or daily conversation, many highly complex deep neural networks do not perform well on small corpora from specific domains. To address these problems, we propose a novel CNN model with an attention mechanism over multi-window-sized kernels to capture the most important information, and we evaluate our system not only on the general SemEval 2010 dataset but also on a small dataset manually built from a Chinese fundamentals-of-electric-circuits textbook. The experimental results show that our system outperforms the baseline systems on the SemEval 2010 relation classification task and validate the effectiveness of CNNs on the specific small Chinese-corpus relation classification task.
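
As a rough illustration of the idea described in the abstract, below is a minimal PyTorch sketch of a relation classifier that convolves the input with several kernel window sizes and uses an attention layer to weight the pooled features produced by each window size. All names and hyperparameters here (MultiWindowAttentionCNN, embed_dim, num_filters, the window sizes, and the 19 output classes) are illustrative assumptions; the abstract does not specify the paper's exact architecture, which may apply attention differently.

```python
# Minimal sketch (assumption: PyTorch; hyperparameters are illustrative) of a
# relation-classification CNN that combines several kernel window sizes and
# attends over the per-window-size features. Not the paper's exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiWindowAttentionCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_filters=128,
                 window_sizes=(2, 3, 4, 5), num_classes=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One 1-D convolution per window size; each yields `num_filters` features.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in window_sizes
        )
        # Attention scores one weight per window-sized feature vector, letting the
        # model emphasize the most informative window size for each sentence.
        self.attn = nn.Linear(num_filters, 1)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)        # (batch, embed_dim, seq_len)
        # Max-pool each convolution over time -> one vector per window size.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        feats = torch.stack(pooled, dim=1)                   # (batch, n_windows, num_filters)
        weights = torch.softmax(self.attn(feats), dim=1)     # (batch, n_windows, 1)
        fused = (weights * feats).sum(dim=1)                 # (batch, num_filters)
        return self.classifier(fused)


# Example forward pass with toy dimensions.
model = MultiWindowAttentionCNN(vocab_size=5000)
logits = model(torch.randint(0, 5000, (8, 40)))              # (8, num_classes)
```

The attention step replaces a plain concatenation of the multi-window features: instead of treating every window size equally, the softmax weights let the classifier lean on whichever n-gram width is most informative, which is one plausible reading of the abstract's "attention mechanism for multi-window-sized kernels".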