{"title":"基于层次关注网络和BERT的零概率多语言情感分析","authors":"A. Sarkar, Sujeeth Reddy, Raghu Sesha Iyengar","doi":"10.1145/3342827.3342850","DOIUrl":null,"url":null,"abstract":"Sentiment analysis is considered an important downstream task in language modelling. We propose Hierarchical Attentive Network using BERT for document sentiment classification. We further showed that importing representation from Multiplicative LSTM model in our architecture results in faster convergence. We then propose a method to build a sentiment classifier for a language in which we have no labelled sentiment data. We exploit the possible semantic invariance across languages in the context of sentiment to achieve this.","PeriodicalId":254461,"journal":{"name":"Proceedings of the 2019 3rd International Conference on Natural Language Processing and Information Retrieval","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Zero-Shot Multilingual Sentiment Analysis using Hierarchical Attentive Network and BERT\",\"authors\":\"A. Sarkar, Sujeeth Reddy, Raghu Sesha Iyengar\",\"doi\":\"10.1145/3342827.3342850\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sentiment analysis is considered an important downstream task in language modelling. We propose Hierarchical Attentive Network using BERT for document sentiment classification. We further showed that importing representation from Multiplicative LSTM model in our architecture results in faster convergence. We then propose a method to build a sentiment classifier for a language in which we have no labelled sentiment data. 
We exploit the possible semantic invariance across languages in the context of sentiment to achieve this.\",\"PeriodicalId\":254461,\"journal\":{\"name\":\"Proceedings of the 2019 3rd International Conference on Natural Language Processing and Information Retrieval\",\"volume\":\"29 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-06-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2019 3rd International Conference on Natural Language Processing and Information Retrieval\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3342827.3342850\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2019 3rd International Conference on Natural Language Processing and Information Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3342827.3342850","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
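The core of a hierarchical attentive network is to score each sentence's relevance and pool sentence embeddings into a single document vector. The following is a minimal numpy sketch of that attention-pooling step, not the authors' implementation: it assumes the per-sentence embeddings (e.g. BERT [CLS] vectors) are already computed, and the function and variable names here are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attentive_pooling(sentence_embs, context_vec):
    """Collapse per-sentence embeddings into one document vector.

    sentence_embs: (num_sentences, dim) array, e.g. BERT [CLS] vectors.
    context_vec:   (dim,) learned context vector scoring each sentence.
    Returns (doc_vector, attention_weights).
    """
    scores = sentence_embs @ context_vec   # one relevance score per sentence
    weights = softmax(scores)              # attention distribution over sentences
    doc_vec = weights @ sentence_embs      # weighted sum -> (dim,) document vector
    return doc_vec, weights

# Toy usage: 3 sentences with 4-dimensional embeddings.
rng = np.random.default_rng(0)
sents = rng.normal(size=(3, 4))
ctx = rng.normal(size=4)
doc, w = attentive_pooling(sents, ctx)
```

In a full model the context vector (and any projection before it) would be learned jointly with the document-level classifier, so that sentences carrying sentiment receive higher attention weight.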