Computational Statistics: Latest Publications

A sparse estimate based on variational approximations for semiparametric generalized additive models
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-25 DOI: 10.1007/s00180-024-01485-2
Fan Yang, Yuehan Yang
{"title":"A sparse estimate based on variational approximations for semiparametric generalized additive models","authors":"Fan Yang, Yuehan Yang","doi":"10.1007/s00180-024-01485-2","DOIUrl":"https://doi.org/10.1007/s00180-024-01485-2","url":null,"abstract":"","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140382472","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Likelihood inference for unified transformation cure model with interval censored data
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-25 DOI: 10.1007/s00180-024-01480-7
Jodi Treszoks, S. Pal
{"title":"Likelihood inference for unified transformation cure model with interval censored data","authors":"Jodi Treszoks, S. Pal","doi":"10.1007/s00180-024-01480-7","DOIUrl":"https://doi.org/10.1007/s00180-024-01480-7","url":null,"abstract":"","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140382681","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Which C compiler and BLAS/LAPACK library should I use: gretl’s numerical efficiency in different configurations
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-22 DOI: 10.1007/s00180-024-01461-w
Marcin Błażejowski
{"title":"Which C compiler and BLAS/LAPACK library should I use: gretl’s numerical efficiency in different configurations","authors":"Marcin Błażejowski","doi":"10.1007/s00180-024-01461-w","DOIUrl":"https://doi.org/10.1007/s00180-024-01461-w","url":null,"abstract":"","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140215260","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Simultaneous confidence bands for multiple comparisons of several percentile lines
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-15 DOI: 10.1007/s00180-024-01481-6
Sanyu Zhou, Yu Zhang
{"title":"Simultaneous confidence bands for multiple comparisons of several percentile lines","authors":"Sanyu Zhou, Yu Zhang","doi":"10.1007/s00180-024-01481-6","DOIUrl":"https://doi.org/10.1007/s00180-024-01481-6","url":null,"abstract":"","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140238203","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Robust estimation of functional factor models with functional pairwise spatial signs
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-13 DOI: 10.1007/s00180-024-01477-2
Shuquan Yang, N. Ling
{"title":"Robust estimation of functional factor models with functional pairwise spatial signs","authors":"Shuquan Yang, N. Ling","doi":"10.1007/s00180-024-01477-2","DOIUrl":"https://doi.org/10.1007/s00180-024-01477-2","url":null,"abstract":"","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140246980","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A subspace aggregating algorithm for accurate classification
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-09 DOI: 10.1007/s00180-024-01476-3
Saeid Amiri, Reza Modarres
{"title":"A subspace aggregating algorithm for accurate classification","authors":"Saeid Amiri, Reza Modarres","doi":"10.1007/s00180-024-01476-3","DOIUrl":"https://doi.org/10.1007/s00180-024-01476-3","url":null,"abstract":"<p>We present a technique for learning via aggregation in supervised classification. The new method improves classification performance, regardless of which classifier is at its core. This approach exploits the information hidden in subspaces by combinations of aggregating variables and is applicable to high-dimensional data sets. We provide algorithms that randomly divide the variables into smaller subsets and permute them before applying a classification method to each subset. We combine the resulting classes to predict the class membership. Theoretical and simulation analyses consistently demonstrate the high accuracy of our classification methods. In comparison to aggregating observations through sampling, our approach proves to be significantly more effective. Through extensive simulations, we evaluate the accuracy of various classification methods. To further illustrate the effectiveness of our techniques, we apply them to five real-world data sets.</p>","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140075811","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Imbalanced data sampling design based on grid boundary domain for big data
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-08 DOI: 10.1007/s00180-024-01471-8
{"title":"Imbalanced data sampling design based on grid boundary domain for big data","authors":"","doi":"10.1007/s00180-024-01471-8","DOIUrl":"https://doi.org/10.1007/s00180-024-01471-8","url":null,"abstract":"<h3>Abstract</h3> <p>The data distribution is often associated with a <em>priori</em>-known probability, and the occurrence probability of interest events is small, so a large amount of imbalanced data appears in sociology, economics, engineering, and various other fields. The existing over- and under-sampling methods are widely used in imbalanced data classification problems, but over-sampling leads to overfitting, and under-sampling ignores the effective information. We propose a new sampling design algorithm called the neighbor grid of boundary mixed-sampling (NGBM), which focuses on the boundary information. This paper obtains the classification boundary information through grid boundary domain identification, thereby determining the importance of the samples. Based on this premise, the synthetic minority oversampling technique is applied to the boundary grid, and random under-sampling is applied to the other grids. With the help of this mixed sampling strategy, more important classification boundary information, especially for positive sample information identification is extracted. Numerical simulations and real data analysis are used to discuss the parameter-setting strategy of the NGBM and illustrate the advantages of the proposed NGBM in the imbalanced data, as well as practical applications.</p>","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140075873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Sparse estimation of linear model via Bayesian method
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-04 DOI: 10.1007/s00180-024-01474-5
{"title":"Sparse estimation of linear model via Bayesian method $$^*$$","authors":"","doi":"10.1007/s00180-024-01474-5","DOIUrl":"https://doi.org/10.1007/s00180-024-01474-5","url":null,"abstract":"<h3>Abstract</h3> <p>This paper considers the sparse estimation problem of regression coefficients in the linear model. Note that the global–local shrinkage priors do not allow the regression coefficients to be truly estimated as zero, we propose three threshold rules and compare their contraction properties, and also tandem those rules with the popular horseshoe prior and the horseshoe+ prior that are normally regarded as global–local shrinkage priors. The hierarchical prior expressions for the horseshoe prior and the horseshoe+ prior are obtained, and the full conditional posterior distributions for all parameters for algorithm implementation are also given. Simulation studies indicate that the horseshoe/horseshoe+ prior with the threshold rules are both superior to the spike-slab models. Finally, a real data analysis demonstrates the effectiveness of variable selection of the proposed method.</p>","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140036222","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Degree selection methods for curve estimation via Bernstein polynomials
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-02 DOI: 10.1007/s00180-024-01473-6
{"title":"Degree selection methods for curve estimation via Bernstein polynomials","authors":"","doi":"10.1007/s00180-024-01473-6","DOIUrl":"https://doi.org/10.1007/s00180-024-01473-6","url":null,"abstract":"<h3>Abstract</h3> <p>Bernstein Polynomial (BP) bases can uniformly approximate any continuous function based on observed noisy samples. However, a persistent challenge is the data-driven selection of a suitable degree for the BPs. In the absence of noise, asymptotic theory suggests that a larger degree leads to better approximation. However, in the presence of noise, which reduces bias, a larger degree also results in larger variances due to high-dimensional parameter estimation. Thus, a balance in the classic bias-variance trade-off is essential. The main objective of this work is to determine the minimum possible degree of the approximating BPs using probabilistic methods that are robust to various shapes of an unknown continuous function. Beyond offering theoretical guidance, the paper includes numerical illustrations to address the issue of determining a suitable degree for BPs in approximating arbitrary continuous functions.</p>","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140016810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Automatic piecewise linear regression
IF 1.3, Q4 (Mathematics)
Computational Statistics Pub Date: 2024-03-01 DOI: 10.1007/s00180-024-01475-4
Mathias von Ottenbreit, Riccardo De Bin
{"title":"Automatic piecewise linear regression","authors":"Mathias von Ottenbreit, Riccardo De Bin","doi":"10.1007/s00180-024-01475-4","DOIUrl":"https://doi.org/10.1007/s00180-024-01475-4","url":null,"abstract":"<p>Regression modelling often presents a trade-off between predictiveness and interpretability. Highly predictive and popular tree-based algorithms such as Random Forest and boosted trees predict very well the outcome of new observations, but the effect of the predictors on the result is hard to interpret. Highly interpretable algorithms like linear effect-based boosting and MARS, on the other hand, are typically less predictive. Here we propose a novel regression algorithm, automatic piecewise linear regression (APLR), that combines the predictiveness of a boosting algorithm with the interpretability of a MARS model. In addition, as a boosting algorithm, it automatically handles variable selection, and, as a MARS-based approach, it takes into account non-linear relationships and possible interaction terms. We show on simulated and real data examples how APLR’s performance is comparable to that of the top-performing approaches in terms of prediction, while offering an easy way to interpret the results. APLR has been implemented in C++ and wrapped in a Python package as a Scikit-learn compatible estimator.</p>","PeriodicalId":55223,"journal":{"name":"Computational Statistics","volume":null,"pages":null},"PeriodicalIF":1.3,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140016808","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0