{"title":"从基本模型集成的角度介绍预测模型","authors":"Yu-Chung Peng","doi":"10.1109/CDS52072.2021.00031","DOIUrl":null,"url":null,"abstract":"The combination model is a widely used method way in machine learning to design the more sophisticated and well-performed model, which can be applied for the field of the recommendation system and so on. This paper is aims to explore the prediction model The normal prediction models have to deal with neither dense numerical features or sparse categorical features. Decision tree is a popular way to deal with dense data, doing both regression and classification. It has various famous expanding algorithms, such as GBDT, XGBoost, and so on, which is designed by different ways of combing the basic decision tree. GBDT and XGBoost are both good at Large-scale machine learning and perform well with dense numerical features. As for sparse categorical features. FM is no doubt the best basic algorithm designed for it. By combing FM with other models, Wide&Deep and DeepFM are better for the recommended system because they perform well and can deal with high-order feature intersection. Furthermore, we can not only just add models together, Deep&Cross and DeepGBM are examples which use NN (natural network) to approximate other models, combing the different parts to an integration. And NN can not only substitute the model is approximated, but also is suited for the online training. This paper is aims to explore the development of the prediction models and the future trend for its improvement.","PeriodicalId":380426,"journal":{"name":"2021 2nd International Conference on Computing and Data Science (CDS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An Introduction of Prediction Models From the View of Integration Between Basic Models\",\"authors\":\"Yu-Chung Peng\",\"doi\":\"10.1109/CDS52072.2021.00031\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The combination model is a widely used method way in machine learning to design the more sophisticated and well-performed model, which can be applied for the field of the recommendation system and so on. This paper is aims to explore the prediction model The normal prediction models have to deal with neither dense numerical features or sparse categorical features. Decision tree is a popular way to deal with dense data, doing both regression and classification. It has various famous expanding algorithms, such as GBDT, XGBoost, and so on, which is designed by different ways of combing the basic decision tree. GBDT and XGBoost are both good at Large-scale machine learning and perform well with dense numerical features. As for sparse categorical features. FM is no doubt the best basic algorithm designed for it. By combing FM with other models, Wide&Deep and DeepFM are better for the recommended system because they perform well and can deal with high-order feature intersection. Furthermore, we can not only just add models together, Deep&Cross and DeepGBM are examples which use NN (natural network) to approximate other models, combing the different parts to an integration. And NN can not only substitute the model is approximated, but also is suited for the online training. 
This paper is aims to explore the development of the prediction models and the future trend for its improvement.\",\"PeriodicalId\":380426,\"journal\":{\"name\":\"2021 2nd International Conference on Computing and Data Science (CDS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 2nd International Conference on Computing and Data Science (CDS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CDS52072.2021.00031\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 2nd International Conference on Computing and Data Science (CDS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CDS52072.2021.00031","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Introduction of Prediction Models From the View of Integration Between Basic Models
Model combination is a widely used technique in machine learning for building more sophisticated, better-performing models, and it is applied in recommendation systems and other fields. Typical prediction models must handle either dense numerical features or sparse categorical features. The decision tree is a popular method for dense data, supporting both regression and classification, and it has well-known extensions such as GBDT and XGBoost, which are designed by combining basic decision trees in different ways. GBDT and XGBoost are both suited to large-scale machine learning and perform well on dense numerical features. For sparse categorical features, FM is without doubt the best basic algorithm. By combining FM with other models, Wide&Deep and DeepFM are better suited to recommendation systems because they perform well and can handle high-order feature interactions. Furthermore, models need not simply be added together: Deep&Cross and DeepGBM are examples that use a neural network (NN) to approximate other models, merging the different parts into one integrated model. Such an NN can not only substitute for the model it approximates but is also well suited to online training. This paper explores the development of these prediction models and the future trends for their improvement.
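For context, the second-order factorization machine (FM) mentioned in the abstract adds factorized pairwise interaction terms to a linear model, which is what lets it estimate interaction weights for sparse categorical features whose value pairs rarely co-occur in training data. Below is a minimal sketch of the standard FM prediction rule (the general formulation, not code taken from this paper), first as an equation and then as plain NumPy:

\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle x_i x_j

import numpy as np

def fm_predict(x, w0, w, V):
    # x : (n,) input feature vector (often sparse one-hot values)
    # w0: scalar bias; w: (n,) first-order weights
    # V : (n, k) latent factor matrix; row i is the k-dim vector v_i for feature i
    linear = w0 + w @ x
    # Pairwise term sum_{i<j} <v_i, v_j> x_i x_j computed in O(n*k) via
    # 0.5 * sum_f [ (sum_i V[i,f] * x[i])^2 - sum_i V[i,f]^2 * x[i]^2 ]
    xv = V.T @ x
    x2v2 = (V ** 2).T @ (x ** 2)
    return linear + 0.5 * float(np.sum(xv ** 2 - x2v2))

# Toy usage with random parameters (illustrative values only).
rng = np.random.default_rng(0)
n, k = 6, 3
x = np.zeros(n)
x[1] = 1.0  # e.g. one-hot user id
x[4] = 1.0  # e.g. one-hot item id
print(fm_predict(x, 0.1, rng.normal(size=n), rng.normal(size=(n, k))))

DeepFM builds directly on this component, sharing the latent vectors v_i as feature embeddings with its deep network so that low-order and high-order feature interactions are learned jointly.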