{"title":"基于领域自适应的T-LBERT跨领域情感分类","authors":"Hongye Cao, Qianru Wei, Jiangbin Zheng","doi":"10.34028/iajit/20/1/15","DOIUrl":null,"url":null,"abstract":"Cross-domain sentiment classification transfers the knowledge from the source domain to the target domain lacking supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by extracting domain-invariant features manually. However, these methods have poor adaptability to bridge connections across different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaption to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain and the topic information of the target domain to improve the domain adaptability of the model. Due to the unbalanced distribution of information in the combined data, we apply a two-layer attention adaptive mechanism for classification. A shallow attention layer is applied to weigh the important features of the combined data. Inspired by active learning, we propose a deep domain adaption layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods. T-LBERT shows stable classification performance on multiple metrics.","PeriodicalId":13624,"journal":{"name":"Int. Arab J. Inf. Technol.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification\",\"authors\":\"Hongye Cao, Qianru Wei, Jiangbin Zheng\",\"doi\":\"10.34028/iajit/20/1/15\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Cross-domain sentiment classification transfers the knowledge from the source domain to the target domain lacking supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by extracting domain-invariant features manually. However, these methods have poor adaptability to bridge connections across different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaption to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain and the topic information of the target domain to improve the domain adaptability of the model. Due to the unbalanced distribution of information in the combined data, we apply a two-layer attention adaptive mechanism for classification. A shallow attention layer is applied to weigh the important features of the combined data. Inspired by active learning, we propose a deep domain adaption layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods. T-LBERT shows stable classification performance on multiple metrics.\",\"PeriodicalId\":13624,\"journal\":{\"name\":\"Int. Arab J. Inf. 
Technol.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Int. Arab J. Inf. Technol.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.34028/iajit/20/1/15\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. Arab J. Inf. Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.34028/iajit/20/1/15","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification
Cross-domain sentiment classification transfers knowledge from a source domain to a target domain that lacks supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by manually extracting domain-invariant features. However, these methods adapt poorly when bridging different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaptation to improve the adaptability of cross-domain sentiment classification. The model combines the learning content of the source domain with the topic information of the target domain to improve its domain adaptability. Because information is unevenly distributed in the combined data, we apply a two-layer adaptive attention mechanism for classification: a shallow attention layer weights the important features of the combined data, and, inspired by active learning, a deep domain adaptation layer actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that T-LBERT considerably outperforms other state-of-the-art methods and shows stable classification performance on multiple metrics.
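The abstract only outlines the two-layer mechanism, so the following PyTorch sketch shows one plausible reading of it: a shallow attention layer over the combined source-content and target-topic features, followed by a deep adaptation layer before classification. All class names, dimensions, and the topic-feature source (e.g., LDA-style topic vectors) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two-layer attention architecture described in the
# abstract. Layer names, sizes, and the topic-feature source are assumptions.
import torch
import torch.nn as nn

class TwoLayerAttentionClassifier(nn.Module):
    def __init__(self, hidden_dim=768, topic_dim=50, num_classes=2):
        super().__init__()
        # Project target-domain topic features (e.g., LDA vectors) into the
        # same space as the encoder's token representations.
        self.topic_proj = nn.Linear(topic_dim, hidden_dim)
        # Shallow attention layer: weights the important features of the
        # combined source-content / target-topic data.
        self.shallow_attn = nn.MultiheadAttention(hidden_dim, num_heads=8,
                                                  batch_first=True)
        # Deep domain-adaptation layer: the paper describes an active,
        # parameter-adjusting layer; here it is reduced to a plain
        # projection placeholder.
        self.deep_adapt = nn.Linear(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, encoder_states, topic_vec):
        # encoder_states: (batch, seq_len, hidden_dim) from a Lite-BERT encoder
        # topic_vec:      (batch, topic_dim) topic features of the target domain
        topic = self.topic_proj(topic_vec).unsqueeze(1)       # (batch, 1, hidden)
        combined = torch.cat([encoder_states, topic], dim=1)  # append topic token
        attended, _ = self.shallow_attn(combined, combined, combined)
        pooled = attended.mean(dim=1)                         # simple mean pooling
        adapted = torch.tanh(self.deep_adapt(pooled))
        return self.classifier(adapted)

# Example with random tensors standing in for Lite-BERT encoder outputs:
model = TwoLayerAttentionClassifier()
states = torch.randn(4, 128, 768)   # batch of 4 encoded reviews
topics = torch.randn(4, 50)         # target-domain topic vectors
logits = model(states, topics)      # (4, 2) sentiment logits
```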