Comparisons between Resampling Techniques in Linear Regression: A Simulation Study

A. Fitrianto, Punitha Linganathan
{"title":"Comparisons between Resampling Techniques in Linear Regression: A Simulation Study","authors":"A. Fitrianto, Punitha Linganathan","doi":"10.18860/ca.v7i3.14550","DOIUrl":null,"url":null,"abstract":"The classic methods used in estimating the parameters in linear regression need to fulfill some assumptions. If the assumptions are not fulfilled, the conclusion is questionable. Resampling is one of the ways to avoid such problems. The study aims to compare resampling techniques in linear regression. The original data used in the study is clean, without any influential observations, outliers and leverage points.  The ordinary least square method was used as the primary method to estimate the parameters and then compared with resampling techniques. The variance, p-value, bias, and standard error are used as a scale to estimate the best method among random bootstrap, residual bootstrap and delete-one Jackknife. After all the analysis took place, it was found that random bootstrap did not perform well while residual and delete-one Jackknife works quite well. Random bootstrap, residual bootstrap, and Jackknife estimate better than ordinary least square. Is was found that residual bootstrap works well in estimating the parameter in the small sample. At the same time, it is suggested to use Jackknife when the sample size is big because Jackknife is more accessible to apply than residual bootstrap and Jackknife works well when the sample size is big.","PeriodicalId":388519,"journal":{"name":"CAUCHY: Jurnal Matematika Murni dan Aplikasi","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"CAUCHY: Jurnal Matematika Murni dan Aplikasi","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18860/ca.v7i3.14550","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The classical methods for estimating the parameters of a linear regression require certain assumptions to be fulfilled. If the assumptions are not met, the conclusions are questionable. Resampling is one way to avoid such problems. This study aims to compare resampling techniques in linear regression. The original data used in the study are clean, without any influential observations, outliers, or leverage points. The ordinary least squares method was used as the primary method to estimate the parameters and was then compared with the resampling techniques. Variance, p-value, bias, and standard error were used as the criteria to identify the best method among the random bootstrap, the residual bootstrap, and the delete-one jackknife. After the analysis, it was found that the random bootstrap did not perform well, while the residual bootstrap and the delete-one jackknife worked quite well. The random bootstrap, residual bootstrap, and jackknife all estimate better than ordinary least squares. It was found that the residual bootstrap works well for estimating the parameters in small samples. At the same time, the jackknife is suggested when the sample size is large, because the jackknife is easier to apply than the residual bootstrap and it works well with large samples.
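To make the comparison concrete, the sketch below illustrates how the three resampling schemes named in the abstract can be applied to the slope of a simple linear regression: the random (case) bootstrap resamples (x, y) pairs, the residual bootstrap resamples residuals around the OLS fit, and the delete-one jackknife refits the model leaving out one observation at a time. This is a minimal illustration, not the authors' simulation design; the simulated data, sample size, and number of bootstrap replicates are hypothetical choices.

```python
# Minimal sketch of the three resampling schemes for OLS regression.
# Data, sample size, and replicate counts are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated "clean" data: y = 2 + 3x + noise, no outliers or leverage points.
n = 30
x = rng.uniform(0, 10, n)
y = 2 + 3 * x + rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), x])

def ols(X, y):
    """Ordinary least squares coefficient estimates."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_hat = ols(X, y)
fitted = X @ beta_hat
resid = y - fitted
B = 2000  # number of bootstrap replicates (illustrative)

# Random (case) bootstrap: resample (x_i, y_i) pairs with replacement.
case_boot = np.array([ols(X[idx], y[idx])
                      for idx in (rng.integers(0, n, n) for _ in range(B))])

# Residual bootstrap: keep X fixed, resample residuals, rebuild the response.
resid_boot = np.array([ols(X, fitted + rng.choice(resid, n, replace=True))
                       for _ in range(B)])

# Delete-one jackknife: refit leaving out one observation at a time.
jack = np.array([ols(np.delete(X, i, axis=0), np.delete(y, i))
                 for i in range(n)])
jack_se = np.sqrt((n - 1) / n * np.sum((jack - jack.mean(axis=0)) ** 2, axis=0))

print("OLS slope:             %.4f" % beta_hat[1])
print("Case bootstrap SE:     %.4f" % case_boot[:, 1].std(ddof=1))
print("Residual bootstrap SE: %.4f" % resid_boot[:, 1].std(ddof=1))
print("Jackknife SE:          %.4f" % jack_se[1])
```

With clean data such as this, all three resampling standard errors should be close to one another and to the classical OLS standard error; the paper's comparison uses variance, p-value, bias, and standard error of the resulting estimates to rank the methods.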