{"title":"Improving Self-Attention Based News Recommendation with Document Classification","authors":"Hao Ke","doi":"10.1109/ICMLC51923.2020.9469577","DOIUrl":null,"url":null,"abstract":"Online news services have become the first choice to read news for many internet users. However, thousands of news articles are released and updated on a daily basis, which makes it impossible for users to select relevant and intriguing stories by themselves. The news recommendation models are developed to tackle information overload. News stories on various topics are recommended to users from diversified backgrounds by an automated system. In this paper, we propose a neural news recommendation model with self-attention jointly trained by document classification, SARC. The self-attention mechanism captures the long-term relationships among words. The joint training of recommendation and classification improves representation and generalization capability. We demonstrate our model’s superior performances over other state-of-the-art baselines on a large-scale news recommendation dataset.","PeriodicalId":170815,"journal":{"name":"2020 International Conference on Machine Learning and Cybernetics (ICMLC)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Machine Learning and Cybernetics (ICMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC51923.2020.9469577","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
Abstract
Online news services have become the first choice for many internet users to read news. However, thousands of news articles are released and updated every day, which makes it impossible for users to select relevant and intriguing stories by themselves. News recommendation models are developed to tackle this information overload: an automated system recommends news stories on various topics to users from diverse backgrounds. In this paper, we propose SARC, a neural news recommendation model with self-attention that is jointly trained with document classification. The self-attention mechanism captures long-range relationships among words, and the joint training of recommendation and classification improves representation quality and generalization. We demonstrate our model's superior performance over state-of-the-art baselines on a large-scale news recommendation dataset.
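To make the described idea concrete, below is a minimal PyTorch sketch (not the authors' implementation) of the architecture the abstract outlines: a self-attention news encoder whose output feeds both a recommendation scoring head and a document-classification head, trained with a joint loss. All names and hyperparameters (embed_dim, num_heads, num_topics, the mean-pooled user vector, the dot-product click score, and the loss weight alpha) are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a self-attention news encoder jointly trained for recommendation
# and document classification. Layer sizes and pooling choices are assumptions.
import torch
import torch.nn as nn


class SelfAttentionNewsEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_heads=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Multi-head self-attention captures long-range relationships among words.
        self.self_attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.attn_pool = nn.Linear(embed_dim, 1)  # additive attention pooling over words

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        x, _ = self.self_attention(x, x, x)
        weights = torch.softmax(self.attn_pool(x), dim=1)
        return (weights * x).sum(dim=1)                # (batch, embed_dim) news vector


class JointRecClassifier(nn.Module):
    """Shares one news encoder between the recommendation and classification heads."""

    def __init__(self, vocab_size, embed_dim=300, num_topics=15):
        super().__init__()
        self.encoder = SelfAttentionNewsEncoder(vocab_size, embed_dim)
        self.topic_head = nn.Linear(embed_dim, num_topics)

    def forward(self, history_ids, candidate_ids):
        # Encode each clicked news item; average into a user vector (a simple choice).
        b, h, s = history_ids.shape
        hist_vecs = self.encoder(history_ids.reshape(b * h, s)).reshape(b, h, -1)
        user_vec = hist_vecs.mean(dim=1)
        cand_vec = self.encoder(candidate_ids)
        click_score = (user_vec * cand_vec).sum(dim=-1)   # dot-product relevance score
        topic_logits = self.topic_head(cand_vec)          # auxiliary classification head
        return click_score, topic_logits


def joint_loss(click_score, click_label, topic_logits, topic_label, alpha=0.5):
    # Joint objective: binary click prediction plus topic classification.
    rec_loss = nn.functional.binary_cross_entropy_with_logits(click_score, click_label)
    cls_loss = nn.functional.cross_entropy(topic_logits, topic_label)
    return rec_loss + alpha * cls_loss
```

Because both heads backpropagate through the same encoder, the classification signal acts as an auxiliary task that regularizes the shared news representation, which is one plausible reading of how joint training improves generalization in this setting.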