Ke Ji, Y. Yuan, R. Sun, Kun Ma, Zhenxiang Chen, Jian Liu
{"title":"不确定评级数据下基于bagging的推荐集成方法","authors":"Ke Ji, Y. Yuan, R. Sun, Kun Ma, Zhenxiang Chen, Jian Liu","doi":"10.1109/SPAC46244.2018.8965431","DOIUrl":null,"url":null,"abstract":"Matrix factorization (MF) is one of the most-used techniques to build recommender systems. However, in practical use, the existence of noise in the training set brings some uncertainty, degrading the performance of MF approaches. In this paper, we propose a Bagging-based MF framework, an ensemble method of using multiple MF-based models to improve the stability and accuracy. Specifically, our framework first rebuilds new training sets by resampling on the original ratings, then takes advantage of the sets to train MF models and finally combines the predictions from the models in a ensemble way. The experiment results on real data show that our framework can achieve some performance improvement when having noise samples.","PeriodicalId":360369,"journal":{"name":"2018 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Bagging-based ensemble method for recommendations under uncertain rating data\",\"authors\":\"Ke Ji, Y. Yuan, R. Sun, Kun Ma, Zhenxiang Chen, Jian Liu\",\"doi\":\"10.1109/SPAC46244.2018.8965431\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Matrix factorization (MF) is one of the most-used techniques to build recommender systems. However, in practical use, the existence of noise in the training set brings some uncertainty, degrading the performance of MF approaches. In this paper, we propose a Bagging-based MF framework, an ensemble method of using multiple MF-based models to improve the stability and accuracy. Specifically, our framework first rebuilds new training sets by resampling on the original ratings, then takes advantage of the sets to train MF models and finally combines the predictions from the models in a ensemble way. The experiment results on real data show that our framework can achieve some performance improvement when having noise samples.\",\"PeriodicalId\":360369,\"journal\":{\"name\":\"2018 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)\",\"volume\":\"29 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SPAC46244.2018.8965431\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPAC46244.2018.8965431","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Bagging-based ensemble method for recommendations under uncertain rating data
Matrix factorization (MF) is one of the most widely used techniques for building recommender systems. In practice, however, noise in the training set introduces uncertainty that degrades the performance of MF approaches. In this paper, we propose a Bagging-based MF framework, an ensemble method that uses multiple MF-based models to improve stability and accuracy. Specifically, our framework first builds new training sets by resampling the original ratings, then uses these sets to train MF models, and finally combines the models' predictions in an ensemble fashion. Experimental results on real data show that our framework achieves performance improvements when the training data contain noisy samples.
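The sketch below illustrates the general Bagging-plus-MF idea described in the abstract: bootstrap-resample the rating triples, train one MF model per resampled set, and average the models' predictions. It is not the authors' implementation; the function names, hyperparameters, and toy data are assumptions made for illustration only.

```python
# Illustrative sketch of a Bagging-based MF ensemble (assumed details, not the paper's code).
import numpy as np

def train_mf(ratings, n_users, n_items, k=10, lr=0.01, reg=0.05, epochs=30, seed=0):
    """Train a plain matrix-factorization model with SGD on (user, item, rating) triples."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
    for _ in range(epochs):
        for u, i, r in ratings:
            pu, qi = P[u].copy(), Q[i].copy()
            err = r - pu @ qi                      # prediction error for this rating
            P[u] += lr * (err * qi - reg * pu)     # regularized SGD updates
            Q[i] += lr * (err * pu - reg * qi)
    return P, Q

def bagging_mf(ratings, n_users, n_items, n_models=5, seed=0, **mf_kwargs):
    """Bagging step: bootstrap-resample the ratings and train one MF model per sample."""
    rng = np.random.default_rng(seed)
    ratings = np.asarray(ratings, dtype=float)
    models = []
    for m in range(n_models):
        idx = rng.integers(0, len(ratings), size=len(ratings))  # sample with replacement
        boot = [(int(u), int(i), r) for u, i, r in ratings[idx]]
        models.append(train_mf(boot, n_users, n_items, seed=m, **mf_kwargs))
    return models

def predict(models, u, i):
    """Ensemble step: combine the base models' predictions by simple averaging."""
    return float(np.mean([P[u] @ Q[i] for P, Q in models]))

if __name__ == "__main__":
    # Tiny synthetic rating set: (user_id, item_id, rating)
    data = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 2), (2, 1, 1), (2, 2, 5)]
    ensemble = bagging_mf(data, n_users=3, n_items=3, n_models=5)
    print(predict(ensemble, 0, 2))   # averaged prediction for user 0, item 2
```

Averaging is used here only as the simplest way to combine the base predictions; the paper leaves the exact combination strategy to its ensemble step, and other schemes (e.g., weighted averaging) would fit the same framework.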