An Introduction of Prediction Models From the View of Integration Between Basic Models

Yu-Chung Peng
{"title":"从基本模型集成的角度介绍预测模型","authors":"Yu-Chung Peng","doi":"10.1109/CDS52072.2021.00031","DOIUrl":null,"url":null,"abstract":"The combination model is a widely used method way in machine learning to design the more sophisticated and well-performed model, which can be applied for the field of the recommendation system and so on. This paper is aims to explore the prediction model The normal prediction models have to deal with neither dense numerical features or sparse categorical features. Decision tree is a popular way to deal with dense data, doing both regression and classification. It has various famous expanding algorithms, such as GBDT, XGBoost, and so on, which is designed by different ways of combing the basic decision tree. GBDT and XGBoost are both good at Large-scale machine learning and perform well with dense numerical features. As for sparse categorical features. FM is no doubt the best basic algorithm designed for it. By combing FM with other models, Wide&Deep and DeepFM are better for the recommended system because they perform well and can deal with high-order feature intersection. Furthermore, we can not only just add models together, Deep&Cross and DeepGBM are examples which use NN (natural network) to approximate other models, combing the different parts to an integration. And NN can not only substitute the model is approximated, but also is suited for the online training. This paper is aims to explore the development of the prediction models and the future trend for its improvement.","PeriodicalId":380426,"journal":{"name":"2021 2nd International Conference on Computing and Data Science (CDS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An Introduction of Prediction Models From the View of Integration Between Basic Models\",\"authors\":\"Yu-Chung Peng\",\"doi\":\"10.1109/CDS52072.2021.00031\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The combination model is a widely used method way in machine learning to design the more sophisticated and well-performed model, which can be applied for the field of the recommendation system and so on. This paper is aims to explore the prediction model The normal prediction models have to deal with neither dense numerical features or sparse categorical features. Decision tree is a popular way to deal with dense data, doing both regression and classification. It has various famous expanding algorithms, such as GBDT, XGBoost, and so on, which is designed by different ways of combing the basic decision tree. GBDT and XGBoost are both good at Large-scale machine learning and perform well with dense numerical features. As for sparse categorical features. FM is no doubt the best basic algorithm designed for it. By combing FM with other models, Wide&Deep and DeepFM are better for the recommended system because they perform well and can deal with high-order feature intersection. Furthermore, we can not only just add models together, Deep&Cross and DeepGBM are examples which use NN (natural network) to approximate other models, combing the different parts to an integration. And NN can not only substitute the model is approximated, but also is suited for the online training. 
This paper is aims to explore the development of the prediction models and the future trend for its improvement.\",\"PeriodicalId\":380426,\"journal\":{\"name\":\"2021 2nd International Conference on Computing and Data Science (CDS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 2nd International Conference on Computing and Data Science (CDS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CDS52072.2021.00031\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 2nd International Conference on Computing and Data Science (CDS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CDS52072.2021.00031","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Model combination is a widely used approach in machine learning for designing more sophisticated and better-performing models, with applications in fields such as recommender systems. This paper explores prediction models. Typical prediction models must handle either dense numerical features or sparse categorical features. The decision tree is a popular method for dense data, supporting both regression and classification, and it has given rise to well-known ensemble algorithms such as GBDT and XGBoost, which are designed by combining basic decision trees in different ways. GBDT and XGBoost are both well suited to large-scale machine learning and perform well on dense numerical features. For sparse categorical features, FM is without doubt the best basic algorithm designed for them. By combining FM with other models, Wide&Deep and DeepFM are better suited to recommender systems because they perform well and can handle high-order feature interactions. Furthermore, models need not simply be added together: Deep&Cross and DeepGBM are examples that use a neural network (NN) to approximate other models, integrating the different parts into a single whole. The NN can not only substitute for the model it approximates but is also well suited to online training. This paper explores the development of prediction models and the future trends for their improvement.
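The abstract singles out FM (factorization machines) as the basic algorithm for sparse categorical features. As an illustrative aside, the following is a minimal NumPy sketch of second-order FM scoring, using the standard O(nk) reformulation of the pairwise-interaction term; the dimensions, variable names, and toy input are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine (FM) score for one feature vector.

    x  : (n,)   sparse (e.g. one-hot) feature vector
    w0 : scalar global bias
    w  : (n,)   first-order weights
    V  : (n, k) latent factor matrix for pairwise interactions
    """
    linear = w0 + w @ x
    # Pairwise term sum_{i<j} <v_i, v_j> x_i x_j, computed in O(nk) as
    # 0.5 * sum_f [ (sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2 ]
    s = V.T @ x                      # (k,)
    s2 = (V ** 2).T @ (x ** 2)       # (k,)
    pairwise = 0.5 * np.sum(s ** 2 - s2)
    return linear + pairwise

# Toy usage with hypothetical dimensions and random parameters
rng = np.random.default_rng(0)
n, k = 8, 4
x = np.zeros(n)
x[[1, 5]] = 1.0                      # one-hot style sparse input
print(fm_predict(x, 0.1, rng.normal(size=n), rng.normal(size=(n, k))))
```

The shared latent factors in V are what let FM estimate interaction weights for feature pairs that rarely or never co-occur in sparse data, which is the property that models such as DeepFM and Wide&Deep build on.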