Changliang Li, Yujun Zhou, Saike He, Hailiang Wang
{"title":"基于背景知识关注的情感分析","authors":"Changliang Li, Yujun Zhou, Saike He, Hailiang Wang","doi":"10.1109/ISI.2019.8823324","DOIUrl":null,"url":null,"abstract":"Sentiment analysis, which is a fundamental research in the field of natural language processing and artificial intelligence field, has received much attention these years because of its practical applicability and the challenges. However, existing methods only focus on local text information and ignore the background knowledge (such as the director of a movie, the producer of a product). In this paper, we propose a novel LSTM with Background Knowledge Attention Model (LSTM-BKAM) for sentiment analysis. Our model incorporates background knowledge based attentions over different semantic parts of a sentence. The experiment results show that our model achieves state-of-the-art, and substantially better than other approaches.","PeriodicalId":156130,"journal":{"name":"2019 IEEE International Conference on Intelligence and Security Informatics (ISI)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Sentiment Analysis Based on Background Knowledge Attention\",\"authors\":\"Changliang Li, Yujun Zhou, Saike He, Hailiang Wang\",\"doi\":\"10.1109/ISI.2019.8823324\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sentiment analysis, which is a fundamental research in the field of natural language processing and artificial intelligence field, has received much attention these years because of its practical applicability and the challenges. However, existing methods only focus on local text information and ignore the background knowledge (such as the director of a movie, the producer of a product). In this paper, we propose a novel LSTM with Background Knowledge Attention Model (LSTM-BKAM) for sentiment analysis. Our model incorporates background knowledge based attentions over different semantic parts of a sentence. The experiment results show that our model achieves state-of-the-art, and substantially better than other approaches.\",\"PeriodicalId\":156130,\"journal\":{\"name\":\"2019 IEEE International Conference on Intelligence and Security Informatics (ISI)\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Conference on Intelligence and Security Informatics (ISI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISI.2019.8823324\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Intelligence and Security Informatics (ISI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISI.2019.8823324","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sentiment Analysis Based on Background Knowledge Attention
Sentiment analysis, a fundamental task in natural language processing and artificial intelligence, has received much attention in recent years because of its practical applicability and its challenges. However, existing methods focus only on local text information and ignore background knowledge (such as the director of a movie or the producer of a product). In this paper, we propose a novel LSTM with Background Knowledge Attention Model (LSTM-BKAM) for sentiment analysis. Our model incorporates background-knowledge-based attention over different semantic parts of a sentence. The experimental results show that our model achieves state-of-the-art performance, substantially outperforming other approaches.
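To make the idea concrete, the following is a minimal PyTorch sketch of an LSTM classifier whose attention over the sentence is conditioned on a background-knowledge embedding (e.g., a movie's director or a product's producer). The layer sizes, the use of an embedding table for knowledge entries, and the bilinear attention scoring are illustrative assumptions, not the exact LSTM-BKAM architecture from the paper.

```python
# Hypothetical sketch of an LSTM with background-knowledge attention for
# sentence-level sentiment classification. Dimensions and the way background
# knowledge is encoded are assumptions, not the paper's exact LSTM-BKAM model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LSTMBackgroundKnowledgeAttention(nn.Module):
    def __init__(self, vocab_size, kb_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)   # sentence tokens
        self.kb_embed = nn.Embedding(kb_size, embed_dim)        # background-knowledge entries
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn_proj = nn.Linear(embed_dim, 2 * hidden_dim)   # map knowledge query into hidden space
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, tokens, kb_ids):
        # tokens: (batch, seq_len) word ids; kb_ids: (batch,) knowledge-entry ids
        h, _ = self.lstm(self.word_embed(tokens))                   # (batch, seq_len, 2*hidden)
        query = self.attn_proj(self.kb_embed(kb_ids)).unsqueeze(2)  # (batch, 2*hidden, 1)
        scores = torch.bmm(h, query).squeeze(2)                     # attention score per token
        weights = F.softmax(scores, dim=1).unsqueeze(1)             # (batch, 1, seq_len)
        context = torch.bmm(weights, h).squeeze(1)                  # knowledge-weighted sentence vector
        return self.classifier(context)


if __name__ == "__main__":
    model = LSTMBackgroundKnowledgeAttention(vocab_size=5000, kb_size=300)
    tokens = torch.randint(0, 5000, (4, 20))  # batch of 4 sentences, 20 tokens each
    kb_ids = torch.randint(0, 300, (4,))      # one background-knowledge entry per sentence
    print(model(tokens, kb_ids).shape)        # torch.Size([4, 2])
```

In this sketch, the background-knowledge embedding acts as the attention query, so tokens semantically related to the known entity receive higher weight before classification; the paper's actual mechanism for attending over "different semantic parts of a sentence" may differ.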