{"title":"Do applied statisticians prefer more randomness or less? Bootstrap or Jackknife?","authors":"Yannis G. Yatracos","doi":"10.1007/s11222-024-10388-7","DOIUrl":null,"url":null,"abstract":"<p>Bootstrap and Jackknife estimates, <span>\\(T_{n,B}^*\\)</span> and <span>\\(T_{n,J},\\)</span> respectively, of a population parameter <span>\\(\\theta \\)</span> are both used in statistical computations; <i>n</i> is the sample size, <i>B</i> is the number of Bootstrap samples. For any <span>\\(n_0\\)</span> and <span>\\(B_0,\\)</span> Bootstrap samples do not add new information about <span>\\(\\theta \\)</span> being observations from the original sample and when <span>\\(B_0<\\infty ,\\)</span> <span>\\(T_{n_0,B_0}^*\\)</span> includes also resampling variability, an additional source of uncertainty not affecting <span>\\(T_{n_0, J}.\\)</span> These are neglected in theoretical papers with results for the utopian <span>\\(T_{n, \\infty }^*, \\)</span> that do not hold for <span>\\(B<\\infty .\\)</span> The consequence is that <span>\\(T^*_{n_0, B_0}\\)</span> is expected to have larger mean squared error (MSE) than <span>\\(T_{n_0,J},\\)</span> namely <span>\\(T_{n_0,B_0}^*\\)</span> is inadmissible. The amount of inadmissibility may be very large when populations’ parameters, e.g. the variance, are unbounded and/or with big data. A palliating remedy is increasing <i>B</i>, the larger the better, but the MSEs ordering remains unchanged for <span>\\(B<\\infty .\\)</span> This is confirmed theoretically when <span>\\(\\theta \\)</span> is the mean of a population, and is observed in the estimated total MSE for linear regression coefficients. In the latter, the chance the estimated total MSE with <span>\\(T_{n,B}^*\\)</span> improves that with <span>\\(T_{n,J}\\)</span> decreases to 0 as <i>B</i> increases.\n</p>","PeriodicalId":22058,"journal":{"name":"Statistics and Computing","volume":"54 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistics and Computing","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s11222-024-10388-7","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0
Abstract
Bootstrap and Jackknife estimates, \(T_{n,B}^*\) and \(T_{n,J}\) respectively, of a population parameter \(\theta\) are both used in statistical computations; \(n\) is the sample size and \(B\) is the number of Bootstrap samples. For any \(n_0\) and \(B_0\), Bootstrap samples add no new information about \(\theta\), since they are observations drawn from the original sample, and when \(B_0 < \infty\), \(T_{n_0,B_0}^*\) also includes resampling variability, an additional source of uncertainty that does not affect \(T_{n_0,J}\). These facts are neglected in theoretical papers whose results hold for the utopian \(T_{n,\infty}^*\) but not for \(B < \infty\). The consequence is that \(T_{n_0,B_0}^*\) is expected to have larger mean squared error (MSE) than \(T_{n_0,J}\); that is, \(T_{n_0,B_0}^*\) is inadmissible. The amount of inadmissibility may be very large when population parameters, e.g. the variance, are unbounded and/or with big data. A palliating remedy is to increase \(B\), the larger the better, but the MSE ordering remains unchanged for every \(B < \infty\). This is confirmed theoretically when \(\theta\) is the mean of a population, and is observed in the estimated total MSE for linear regression coefficients. In the latter case, the chance that the estimated total MSE with \(T_{n,B}^*\) improves on that with \(T_{n,J}\) decreases to 0 as \(B\) increases.
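The abstract's central claim for the population mean, that the finite-\(B\) bootstrap estimate carries extra resampling noise and therefore a strictly larger MSE than the jackknife estimate for every \(B < \infty\), can be checked in a few lines of simulation. Below is a minimal Monte Carlo sketch, not code from the paper; the standard-normal population, the sample size, the grid of \(B\) values, and the function names are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_mean(x, B, rng):
    # T*_{n,B}: average of B bootstrap-resample means.
    # Conditionally on x it is centered at the sample mean but adds
    # resampling noise with variance of order 1/(n*B).
    n = len(x)
    idx = rng.integers(0, n, size=(B, n))  # B resamples drawn with replacement
    return x[idx].mean(axis=1).mean()

def jackknife_mean(x):
    # T_{n,J}: average of the n leave-one-out means; for the mean this
    # collapses exactly to the sample mean, with no extra randomness.
    n = len(x)
    loo = (x.sum() - x) / (n - 1)
    return loo.mean()

theta = 0.0          # true population mean
n, reps = 30, 5000   # sample size and Monte Carlo replications (arbitrary choices)

for B in (10, 100, 1000):
    mse_boot = mse_jack = 0.0
    for _ in range(reps):
        x = rng.standard_normal(n) + theta
        mse_boot += (bootstrap_mean(x, B, rng) - theta) ** 2
        mse_jack += (jackknife_mean(x) - theta) ** 2
    print(f"B={B:5d}  MSE(bootstrap)={mse_boot/reps:.5f}  MSE(jackknife)={mse_jack/reps:.5f}")
```

Under these assumptions the simulated bootstrap MSE sits above the jackknife MSE at every \(B\), with the gap shrinking roughly like \(1/B\) but never closing, which mirrors the abstract's point that increasing \(B\) palliates the inadmissibility without removing it.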
Journal description:
Statistics and Computing is a bi-monthly refereed journal which publishes papers covering the range of the interface between the statistical and computing sciences.
In particular, it addresses the use of statistical concepts in computing science, for example in machine learning, computer vision and data analytics, as well as the use of computers in data modelling, prediction and analysis. Specific topics which are covered include: techniques for evaluating analytically intractable problems such as bootstrap resampling, Markov chain Monte Carlo, sequential Monte Carlo, approximate Bayesian computation, search and optimization methods, stochastic simulation and Monte Carlo, graphics, computer environments, statistical approaches to software errors, information retrieval, machine learning, statistics of databases and database technology, huge data sets and big data analytics, computer algebra, graphical models, image processing, tomography, inverse problems and uncertainty quantification.
In addition, the journal contains original research reports, authoritative review papers, discussed papers, and occasional special issues on particular topics or carrying proceedings of relevant conferences. Statistics and Computing also publishes book review and software review sections.