{"title":"协同过滤的混合超参数优化","authors":"Peter Szabó, B. Genge","doi":"10.1109/SYNASC51798.2020.00042","DOIUrl":null,"url":null,"abstract":"Collaborative filtering (CF) became a prevalent technique to filter objects a user might like, based on other users' reactions. The neural network based solutions for CF rely on hyper-parameters to control the learning process. This paper documents a solution for hyper-parameter optimization (HPO). We empirically prove that optimizing the hyperparameters leads to a significant performance gain. Moreover, we show a method to streamline HPO while substantially reducing computation time. Our solution relies on the separation of hyper-parameters into two groups, predetermined and automatically optimizable parameters. By minimizing the later, we can significantly reduce the overall time needed for HPO. After an extensive experimental analysis, the method produced significantly better results than manual HPO in the context of a real-world dataset.","PeriodicalId":278104,"journal":{"name":"2020 22nd International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Hybrid Hyper-parameter Optimization for Collaborative Filtering\",\"authors\":\"Peter Szabó, B. Genge\",\"doi\":\"10.1109/SYNASC51798.2020.00042\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Collaborative filtering (CF) became a prevalent technique to filter objects a user might like, based on other users' reactions. The neural network based solutions for CF rely on hyper-parameters to control the learning process. This paper documents a solution for hyper-parameter optimization (HPO). We empirically prove that optimizing the hyperparameters leads to a significant performance gain. Moreover, we show a method to streamline HPO while substantially reducing computation time. Our solution relies on the separation of hyper-parameters into two groups, predetermined and automatically optimizable parameters. By minimizing the later, we can significantly reduce the overall time needed for HPO. 
After an extensive experimental analysis, the method produced significantly better results than manual HPO in the context of a real-world dataset.\",\"PeriodicalId\":278104,\"journal\":{\"name\":\"2020 22nd International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)\",\"volume\":\"13 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-08-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 22nd International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SYNASC51798.2020.00042\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 22nd International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SYNASC51798.2020.00042","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Hybrid Hyper-parameter Optimization for Collaborative Filtering
Collaborative filtering (CF) has become a prevalent technique for filtering objects a user might like based on other users' reactions. Neural-network-based solutions for CF rely on hyper-parameters to control the learning process. This paper documents a solution for hyper-parameter optimization (HPO). We empirically demonstrate that optimizing the hyper-parameters leads to a significant performance gain. Moreover, we present a method to streamline HPO while substantially reducing computation time. Our solution relies on separating the hyper-parameters into two groups: predetermined parameters and automatically optimizable parameters. By minimizing the latter group, we can significantly reduce the overall time needed for HPO. After an extensive experimental analysis, the method produced significantly better results than manual HPO on a real-world dataset.
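
The core idea sketched in the abstract, restricting automatic search to a small optimizable group of hyper-parameters while fixing the remaining ones in advance, can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration only: the parameter names, value ranges, the placeholder evaluate() function, and the plain random-search loop are assumptions made for illustration, not the authors' implementation, search strategy, or dataset.

# Hypothetical sketch: split hyper-parameters into a predetermined group (fixed
# by prior knowledge) and an automatically optimizable group searched by a tuner.
# All names, ranges, and the evaluate() stand-in are illustrative assumptions.
import random

# Predetermined hyper-parameters: chosen once, excluded from the search.
PREDETERMINED = {
    "optimizer": "adam",
    "loss": "bce",
    "epochs": 20,
}

# Automatically optimizable hyper-parameters: the (small) space the tuner explores.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "embedding_dim": [16, 32, 64],
    "dropout": [0.0, 0.2, 0.5],
}

def sample_config():
    """Draw one candidate from the optimizable space and merge it with the fixed group."""
    candidate = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
    return {**PREDETERMINED, **candidate}

def evaluate(config):
    """Placeholder for training a neural CF model and returning a validation score.
    In practice this would fit the recommender and report a metric such as recall@k."""
    return -abs(config["learning_rate"] - 1e-3) + 0.01 * config["embedding_dim"]

def random_search(trials=20, seed=0):
    """Search only the optimizable group; the predetermined group never changes."""
    random.seed(seed)
    best_score, best_config = float("-inf"), None
    for _ in range(trials):
        config = sample_config()
        score = evaluate(config)
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score

if __name__ == "__main__":
    config, score = random_search()
    print("best config:", config, "score:", round(score, 4))

Because the tuner only explores the optimizable group, the number of trials needed to cover the space stays small, which is the source of the computation-time reduction the abstract describes.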