Distilling Knowledge from User Information for Document Level Sentiment Classification
Jialing Song
2019 IEEE 35th International Conference on Data Engineering Workshops (ICDEW), April 8, 2019
DOI: 10.1109/ICDEW.2019.00-15
Combining global user and product characteristics with local review information provides a powerful mechanism for predicting a user's sentiment in a review document about a product on online review sites such as Amazon, Yelp, and IMDB. However, user information is not always available in practice: some users are newly registered, and some sites allow comments without logging in. To address this issue, we introduce a novel knowledge distillation (KD) learning paradigm that transfers user characteristics into the weights of student neural networks that use only product and review information. The teacher model transfers its predictive distributions over the training data to the student model, so user profiles are required only during the training stage. Experimental results on several sentiment classification datasets show that the proposed learning framework enables student models to achieve improved performance.
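The teacher-to-student transfer the abstract describes follows the standard knowledge-distillation recipe: the student is trained to match the teacher's temperature-softened predictive distribution in addition to the hard labels. A minimal sketch of that objective is below; the function names, hyperparameters, and toy logits are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a distillation objective: a teacher (trained with user, product,
# and review features) provides soft predictive distributions that a student
# (product and review features only) learns to match. Illustrative only.
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T gives softer distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q): the distillation term pulling student q toward teacher p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(teacher_logits, student_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Weighted sum of the soft-target KL term and hard-label cross-entropy."""
    soft_teacher = softmax(teacher_logits, temperature)
    soft_student = softmax(student_logits, temperature)
    # T^2 scaling keeps the soft-target gradients comparable in magnitude.
    soft_loss = kl_divergence(soft_teacher, soft_student) * temperature ** 2
    hard_probs = softmax(student_logits)
    hard_loss = -math.log(hard_probs[true_label])
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy 3-class sentiment example (negative / neutral / positive).
teacher_logits = [2.5, 0.3, -1.0]   # teacher is confident in class 0
student_logits = [1.0, 0.8, -0.2]   # student is less certain
loss = distillation_loss(teacher_logits, student_logits, true_label=0)
```

At inference time only the student is run, which is what lets the system drop the user-profile inputs after training.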