Efficient Methods to Improve the Performance of Supervised Learning Models

Mohit Kumar, Pragya Yadav, H. Singh, Ankita Arora
DOI: 10.1109/CONIT51480.2021.9498387
Published in: 2021 International Conference on Intelligent Technologies (CONIT), 2021-06-25
Citation count: 0

Abstract

Supervised learning is defined as training a model on input data that includes the target values. There are a large number of supervised learning algorithms and a great many models; each has its own merits and demerits and performs differently. Many data preprocessing techniques exist, and combining several of them can improve the performance of existing supervised learning models. Raw data cannot be fed directly to a learning model because it may contain a lot of noise; it must first be preprocessed using various data preprocessing techniques. We analysed and compared different data preprocessing techniques and their combinations. The comparison is done using various performance metrics, and different combinations of preprocessing are applied to different models. Our preprocessing steps are categorical data handling, missing value treatment, feature scaling, and feature extraction. Through this comparison, we determined which technique is better suited to which type of model. We used the California census data for our study.
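The four preprocessing steps named in the abstract (categorical data handling, missing value treatment, feature scaling, and feature extraction) can be sketched as a single scikit-learn pipeline. This is a minimal illustration, not the paper's actual implementation: the toy DataFrame below is a hypothetical stand-in for the California census data, and the specific choices (median imputation, one-hot encoding, standardization, PCA) are assumptions about which concrete techniques such a pipeline might use.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical toy frame standing in for the California census data,
# with missing values and a categorical column.
df = pd.DataFrame({
    "median_income":      [8.3, 7.2, np.nan, 5.6, 3.8, 4.0],
    "housing_median_age": [41, 21, 52, 52, np.nan, 36],
    "ocean_proximity":    ["NEAR BAY", "INLAND", "NEAR BAY",
                           "ISLAND", "INLAND", "NEAR BAY"],
})
num_cols = ["median_income", "housing_median_age"]
cat_cols = ["ocean_proximity"]

# Missing value treatment + feature scaling for numeric columns.
num_pipe = Pipeline([
    ("impute", SimpleImputer(strategy="median")),
    ("scale", StandardScaler()),
])

# Categorical data handling: impute then one-hot encode.
cat_pipe = Pipeline([
    ("impute", SimpleImputer(strategy="most_frequent")),
    ("onehot", OneHotEncoder(handle_unknown="ignore")),
])

preprocess = ColumnTransformer([
    ("num", num_pipe, num_cols),
    ("cat", cat_pipe, cat_cols),
])

# Feature extraction via PCA on the preprocessed matrix.
full = Pipeline([("preprocess", preprocess), ("pca", PCA(n_components=2))])
X = full.fit_transform(df)
print(X.shape)  # 6 rows reduced to 2 extracted features
```

Because the whole chain is one `Pipeline`, any downstream supervised model can be appended as a final step, which makes it straightforward to swap preprocessing combinations and compare the resulting model performance as the paper describes.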