OLSAVS: A New Algorithm for Model Selection

Nicklaus T. Hicks, Hasthika S. Rupasinghe Arachchige Don
{"title":"Olsavs: A New Algorithm For Model Selection","authors":"Nicklaus T. Hicks, Hasthika S. Rupasinghe Arachchige Don","doi":"10.5539/ijsp.v12n2p28","DOIUrl":null,"url":null,"abstract":"The shrinkage methods such as Lasso and Relaxed Lasso introduce some bias in order to reduce the variance of the regression coefficients in multiple linear regression models. One way to reduce bias after shrinkage of the coefficients would be to apply ordinary least squares to the subset of predictors selected by the shrinkage method used. This work extensively investigated this idea and developed a new variable selection algorithm. The authors named this technique OLSAVS (Ordinary Least Squares After Variable Selection). The OLSAVS algorithm was implemented in R. Simulations were used to illustrate that the new method is able to produce better predictions with less bias for various error distributions. The OLSAVS method was compared with a few widely used shrinkage methods in terms of their achieved test root mean square error and bias.","PeriodicalId":89781,"journal":{"name":"International journal of statistics and probability","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of statistics and probability","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5539/ijsp.v12n2p28","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Shrinkage methods such as the Lasso and Relaxed Lasso introduce some bias in order to reduce the variance of the regression coefficients in multiple linear regression models. One way to reduce the bias after shrinkage is to apply ordinary least squares to the subset of predictors selected by the shrinkage method. This work investigates that idea extensively and develops a new variable selection algorithm, which the authors name OLSAVS (Ordinary Least Squares After Variable Selection). The OLSAVS algorithm was implemented in R. Simulations illustrate that the new method produces better predictions with less bias across various error distributions. OLSAVS is compared with several widely used shrinkage methods in terms of achieved test root mean square error and bias.
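
The core idea described in the abstract can be sketched in R as follows. This is a minimal illustration of the general "select with the Lasso, then refit OLS on the selected predictors" approach, assuming the glmnet package for the Lasso step; it is not the authors' OLSAVS implementation, and the function and variable names are illustrative.

# Minimal sketch of the select-then-refit idea, assuming glmnet for the Lasso step.
library(glmnet)

olsavs_sketch <- function(x, y) {
  # Step 1: variable selection with a cross-validated Lasso fit
  cv_fit <- cv.glmnet(x, y, alpha = 1)
  beta   <- coef(cv_fit, s = "lambda.min")[-1, 1]  # drop the intercept
  keep   <- which(beta != 0)

  # If the Lasso selects no predictors, fall back to the intercept-only model
  if (length(keep) == 0) return(lm(y ~ 1))

  # Step 2: refit ordinary least squares on the selected predictors only,
  # removing the shrinkage bias from the retained coefficients
  dat <- data.frame(y = y, x[, keep, drop = FALSE])
  lm(y ~ ., data = dat)
}

# Example usage on simulated data
set.seed(1)
n <- 100; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- 2 * x[, 1] - 3 * x[, 2] + rnorm(n)
fit <- olsavs_sketch(x, y)
summary(fit)

The refit step keeps the sparsity chosen by the shrinkage method while restoring unbiased (given the selected model) least-squares estimates of the retained coefficients, which is the trade-off the paper evaluates against Lasso-type estimators.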