Statistics and Computing: Latest Articles

A communication-efficient, online changepoint detection method for monitoring distributed sensor networks
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-14 | DOI: 10.1007/s11222-024-10428-2
Ziyang Yang, Idris A. Eckley, Paul Fearnhead
Abstract: We consider the challenge of efficiently detecting changes within a network of sensors, where we also need to minimise communication between sensors and the cloud. We propose an online, communication-efficient method to detect such changes. The procedure works by performing likelihood ratio tests at each time point, with two thresholds: one to filter out unimportant test statistics and one to make decisions based on the aggregated test statistics. We provide asymptotic theory concerning consistency and the asymptotic distribution when there are no changes. Simulation results suggest that our method can achieve performance similar to the idealised setting, where there are no constraints on communication between sensors, while substantially reducing transmission costs.
Citations: 0
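The two-threshold scheme described in the abstract above lends itself to a compact illustration. The sketch below assumes unit-variance Gaussian observations, a per-sensor CUSUM-type likelihood-ratio statistic, and the illustrative names tau_filter and tau_decide for the communication and decision thresholds; none of these specifics, nor the choice of statistic, are taken from the paper.

```python
import numpy as np

def cusum_stat(x):
    """Maximised likelihood-ratio statistic for a single change in mean,
    assuming unit-variance Gaussian observations (illustrative choice)."""
    n = len(x)
    if n < 2:
        return 0.0
    s = np.cumsum(x)
    k = np.arange(1, n)                          # candidate changepoint locations
    return np.max(s[k - 1]**2 / k + (s[-1] - s[k - 1])**2 / (n - k) - s[-1]**2 / n)

def monitor(streams, tau_filter, tau_decide):
    """Two-threshold monitoring: a sensor transmits its statistic only when it
    exceeds tau_filter; the cloud declares a change when the sum of the
    transmitted statistics exceeds tau_decide."""
    history = [[] for _ in streams]
    for t in range(len(streams[0])):
        transmitted = []
        for hist, stream in zip(history, streams):
            hist.append(stream[t])
            stat = cusum_stat(np.asarray(hist))
            if stat > tau_filter:                # threshold 1: communication filter
                transmitted.append(stat)
        if sum(transmitted) > tau_decide:        # threshold 2: decision on the aggregate
            return t
    return None

rng = np.random.default_rng(1)
streams = [np.r_[rng.normal(0, 1, 50), rng.normal(2, 1, 20)] for _ in range(5)]
print(monitor(streams, tau_filter=5.0, tau_decide=15.0))   # index at which a change is declared
```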
Parsimonious consensus hierarchies, partitions and fuzzy partitioning of a set of hierarchies
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-12 | DOI: 10.1007/s11222-024-10415-7
Ilaria Bombelli, Maurizio Vichi
Abstract: Methodology is described for fitting a fuzzy partition and a parsimonious consensus hierarchy (ultrametric matrix) to a set of hierarchies of the same set of objects. A model is described that defines a fuzzy partition of a set of hierarchical classifications, with every class of the partition synthesized by a parsimonious consensus hierarchy. Each consensus includes an optimal consensus hard partition of objects and all the hierarchical agglomerative aggregations among the clusters of the consensus partition. The performance of the methodology is illustrated by an extended simulation study and applications to real data. A discussion of the new methodology is provided and some interesting future developments are described.
Citations: 0
Reversed particle filtering for hidden Markov models
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-08 | DOI: 10.1007/s11222-024-10426-4
Frank Rotiroti, Stephen G. Walker
Abstract: We present an approach to selecting the distributions in sampling-resampling which improves the efficiency of the weighted bootstrap. To complement the standard scheme of sampling from the prior and reweighting with the likelihood, we introduce a reversed scheme, which samples from the (normalized) likelihood and reweights with the prior. We begin with some motivating examples, before developing the relevant theory. We then apply the approach to the particle filtering of time series, including nonlinear and non-Gaussian Bayesian state-space models, a task that demands efficiency, given the repeated application of the weighted bootstrap. Through simulation studies on a normal dynamic linear model, a Poisson hidden Markov model, and a stochastic volatility model, we demonstrate the gains in efficiency obtained by the approach, involving the choice of the standard or reversed filter. In addition, for the stochastic volatility model, we provide three real-data examples, including a comparison with importance sampling methods that attempt to incorporate information about the data indirectly into the standard filtering scheme, and an extension to multivariate models. We determine that the reversed filtering scheme offers an advantage over such auxiliary methods owing to its ability to incorporate information about the data directly into the sampling, an ability that further facilitates its performance in higher-dimensional settings.
Citations: 0
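As an illustration of the standard versus reversed weighting described in the abstract above, the sketch below performs a single assimilation step for a toy local-level (normal dynamic linear) model. In the reversed step, the prior predictive is approximated by a particle mixture of transition densities; this is one plausible reading of the scheme, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def standard_step(particles, y, sig_x, sig_y):
    """Standard weighted bootstrap: propagate particles through the transition
    (prior), then reweight with the likelihood of the new observation."""
    prop = particles + rng.normal(0.0, sig_x, size=particles.shape)
    w = norm.pdf(y, loc=prop, scale=sig_y)
    return rng.choice(prop, size=len(prop), p=w / w.sum())

def reversed_step(particles, y, sig_x, sig_y):
    """Reversed scheme: sample from the normalised likelihood (viewed as a density
    in the state) and reweight with the prior predictive, here approximated by
    the particle mixture of transition densities."""
    prop = rng.normal(y, sig_y, size=len(particles))
    w = norm.pdf(prop[:, None], loc=particles[None, :], scale=sig_x).mean(axis=1)
    return rng.choice(prop, size=len(prop), p=w / w.sum())

# one assimilation step of a toy local-level model x_t = x_{t-1} + N(0, sig_x^2),
# y_t = x_t + N(0, sig_y^2), starting from a N(0, 1) particle cloud
x_prev = rng.normal(0.0, 1.0, size=2000)
for step in (standard_step, reversed_step):
    post = step(x_prev, y=1.5, sig_x=0.5, sig_y=0.5)
    print(step.__name__, round(post.mean(), 3))   # both schemes target the same filtering mean
```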
Correction to: Bayesian high-dimensional covariate selection in non-linear mixed-effects models using the SAEM algorithm
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-08 | DOI: 10.1007/s11222-024-10421-9
Marion Naveau, Guillaume Kon Kam King, Renaud Rincent, Laure Sansonnet, Maud Delattre
Citations: 0
Screen then select: a strategy for correlated predictors in high-dimensional quantile regression
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-08 | DOI: 10.1007/s11222-024-10424-6
Xuejun Jiang, Yakun Liang, Haofeng Wang
Abstract: Strong correlation among predictors and heavy-tailed noise pose a great challenge in the analysis of ultra-high dimensional data. This challenge leads to an increase in the computation time for discovering active variables and a decrease in selection accuracy. To address this issue, we propose an innovative two-stage screen-then-select approach and its derivative procedure, based on a robust quantile regression with a sparsity assumption. The approach initially screens important features by ranking quantile ridge estimates and subsequently employs a likelihood-based post-screening selection strategy to refine the variable selection. Additionally, we conduct an internal competition mechanism along the greedy search path to enhance robustness against design dependence. Our methods are simple to implement and possess numerous desirable properties from theoretical and computational standpoints. Theoretically, we establish the strong consistency of feature selection for the proposed methods under some regularity conditions. In empirical studies, we assess the finite-sample performance of our methods by comparing them with utility screening approaches and existing penalized quantile regression methods. Furthermore, we apply our methods to identify genes associated with anticancer drug sensitivities for practical guidance.
Citations: 0
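A rough sketch of the screen-then-select idea from the abstract above: stage one ranks the coordinates of a ridge-penalised quantile regression fit and keeps the largest ones, stage two runs a greedy selection on the screened set. The smoothed check loss, the penalty level, and the BIC-type selection criterion are illustrative stand-ins, not the paper's exact choices, and the internal competition mechanism is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def smoothed_check_loss(r, tau, eps=1e-3):
    """Check (pinball) loss with a small Huber-type smoothing around zero."""
    a = np.where(np.abs(r) < eps, r**2 / (2 * eps), np.abs(r) - eps / 2)
    return np.mean(a * np.abs(tau - (r < 0)))

def quantile_ridge(X, y, tau=0.5, lam=1.0):
    """Ridge-penalised quantile regression via L-BFGS on the smoothed loss
    (an illustrative stand-in for the paper's quantile ridge estimator)."""
    n, p = X.shape
    obj = lambda b: smoothed_check_loss(y - X @ b, tau) + lam * np.sum(b**2) / n
    return minimize(obj, np.zeros(p), method="L-BFGS-B").x

def screen_then_select(X, y, tau=0.5, keep=20):
    """Stage 1: keep the `keep` largest coordinates of the quantile ridge fit.
    Stage 2: greedy forward selection on the screened set with a BIC-type
    criterion built from the check loss (illustrative selection rule)."""
    screened = np.argsort(-np.abs(quantile_ridge(X, y, tau)))[:keep]
    n, active, best = len(y), [], np.inf
    while True:
        candidates = [j for j in screened if j not in active]
        if not candidates:
            return active
        scores = {}
        for j in candidates:
            cols = active + [j]
            b = quantile_ridge(X[:, cols], y, tau, lam=0.0)
            loss = smoothed_check_loss(y - X[:, cols] @ b, tau)
            scores[j] = n * np.log(loss) + len(cols) * np.log(n)
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best:
            return active
        best, active = scores[j_best], active + [j_best]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))
y = X[:, 0] - 2 * X[:, 3] + rng.standard_t(df=3, size=200)   # heavy-tailed noise
print(screen_then_select(X, y))                              # ideally recovers columns 0 and 3
```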
R-VGAL: a sequential variational Bayes algorithm for generalised linear mixed models
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-06 | DOI: 10.1007/s11222-024-10422-8
Bao Anh Vu, David Gunawan, Andrew Zammit-Mangion
Abstract: Models with random effects, such as generalised linear mixed models (GLMMs), are often used for analysing clustered data. Parameter inference with these models is difficult because of the presence of cluster-specific random effects, which must be integrated out when evaluating the likelihood function. Here, we propose a sequential variational Bayes algorithm, called Recursive Variational Gaussian Approximation for Latent variable models (R-VGAL), for estimating parameters in GLMMs. The R-VGAL algorithm operates on the data sequentially, requires only a single pass through the data, and can provide parameter updates as new data are collected without the need to re-process the previous data. At each update, the R-VGAL algorithm requires the gradient and Hessian of a "partial" log-likelihood function evaluated at the new observation, which are generally not available in closed form for GLMMs. To circumvent this issue, we propose an importance-sampling-based approach for estimating the gradient and Hessian via Fisher's and Louis' identities. We find that R-VGAL can be unstable when traversing the first few data points, but that this issue can be mitigated by introducing a damping factor in the initial steps of the algorithm. Through illustrations on both simulated and real datasets, we show that R-VGAL provides good approximations to posterior distributions, that it can be made robust through damping, and that it is computationally efficient.
Citations: 0
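The sequential update that R-VGAL builds on can be illustrated on a model where the per-observation gradient and Hessian are available in closed form. The sketch below uses logistic regression without random effects (so no importance sampling is needed) and a simple damping scheme that splits early updates into smaller sub-steps; the GLMM machinery, Fisher's and Louis' identities, and the authors' exact damping rule are not reproduced here.

```python
import numpy as np

def logistic_grad_hess(theta, x, y):
    """Gradient and Hessian of the log-likelihood of one Bernoulli observation
    with success probability sigmoid(x @ theta) (closed form here; R-VGAL
    estimates these quantities by importance sampling in the GLMM setting)."""
    p = 1.0 / (1.0 + np.exp(-x @ theta))
    return (y - p) * x, -p * (1 - p) * np.outer(x, x)

def rvga_step(mu, prec, x, y, n_damp=1):
    """One sequential variational Gaussian update of the mean and precision,
    optionally damped by splitting the observation into n_damp smaller steps."""
    for _ in range(n_damp):
        grad, hess = logistic_grad_hess(mu, x, y)
        prec = prec - hess / n_damp                 # precision (inverse covariance) update
        mu = mu + np.linalg.solve(prec, grad / n_damp)
    return mu, prec

# single pass through simulated data, damping only the first few observations
rng = np.random.default_rng(0)
d, n = 3, 400
theta_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
Y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_true)))

mu, prec = np.zeros(d), np.eye(d)                   # N(0, I) prior
for t in range(n):
    mu, prec = rvga_step(mu, prec, X[t], Y[t], n_damp=4 if t < 10 else 1)
print(mu)                                           # approximate posterior mean
```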
Automated generation of initial points for adaptive rejection sampling of log-concave distributions
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-05 | DOI: 10.1007/s11222-024-10425-5
Jonathan James
Abstract: Adaptive rejection sampling requires that users provide points that span the distribution's mode. If these points are far from the mode, computational costs increase significantly. This paper introduces a simple, automated approach for selecting initial points that uses numerical optimization to quickly bracket the mode. When the given initial point resides in a high-density area, the method often requires just four function evaluations to draw a sample, only one more than the sampler's minimum. This feature makes it well suited for Gibbs sampling, where the previous round's draw can serve as the starting point.
Citations: 0
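The abstract above does not spell out the optimisation routine, but the basic task of bracketing the mode of a log-concave density can be sketched with a geometric expansion along the uphill direction of the log-density derivative, as below. The function name, step rule, and stopping rule are assumptions made for illustration only.

```python
import numpy as np

def bracket_mode(logpdf_prime, x0, step=1.0, max_iter=50):
    """Return two points whose log-density derivatives have opposite signs, so
    that they straddle the mode of a log-concave density (a generic bracketing
    sketch; the paper's exact optimisation-based rule may differ)."""
    d0 = logpdf_prime(x0)
    if d0 == 0.0:
        return x0 - step, x0 + step            # started exactly at the mode
    direction = np.sign(d0)                    # move uphill
    x_prev, x = x0, x0 + direction * step
    for _ in range(max_iter):
        if np.sign(logpdf_prime(x)) != direction:
            # derivative flipped sign: the mode lies between x_prev and x
            return (x, x_prev) if direction < 0 else (x_prev, x)
        step *= 2.0                            # geometric expansion of the step
        x_prev, x = x, x + direction * step
    raise RuntimeError("failed to bracket the mode")

# example: standard normal, d/dx log pdf = -x, starting far below the mode
left, right = bracket_mode(lambda x: -x, x0=-7.3)
print(left, right)                             # points spanning the mode at 0
```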
Parsimonious ultrametric Gaussian mixture models
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-04-01 | DOI: 10.1007/s11222-024-10405-9
Carlo Cavicchia, Maurizio Vichi, Giorgia Zaccaria
Abstract: Gaussian mixture models represent a conceptually and mathematically elegant class of models for casting the density of a heterogeneous population, where the observed data are collected from a population composed of a finite set of G homogeneous subpopulations with a Gaussian distribution. A limitation of these models is that they suffer from the curse of dimensionality: the number of parameters easily becomes extremely large in the presence of high-dimensional data. In this paper, we propose a class of parsimonious Gaussian mixture models with constrained extended ultrametric covariance structures that are capable of exploring hierarchical relations among variables. The proposal requires a reduced number of parameters to be fit and includes constrained covariance structures across and within components that further reduce the number of parameters of the model.
Citations: 0
Stochastic three-term conjugate gradient method with variance technique for non-convex learning
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-03-27 | DOI: 10.1007/s11222-024-10409-5
Chen Ouyang, Chenkaixiang Lu, Xiong Zhao, Ruping Huang, Gonglin Yuan, Yiyan Jiang
Abstract: In the training process of machine learning, minimization of the empirical risk loss function is often used to measure the difference between the model's predicted value and the real value. Stochastic gradient descent is very popular for this type of optimization problem but converges slowly in theoretical analysis. To address this, many algorithms with variance reduction techniques have been proposed, such as SVRG, SAG, and SAGA. Some scholars have applied the conjugate gradient method from traditional optimization to these algorithms, yielding methods such as CGVR, SCGA, and SCGN, which can essentially achieve a linear convergence rate; however, these conclusions often need to be established under relatively strong assumptions. In traditional optimization, the conjugate gradient method often requires line search techniques to achieve good experimental results, and in a sense line search embodies some properties of conjugate methods. Taking inspiration from this, we apply a modified three-term conjugate gradient method and line search technique to machine learning. In our theoretical analysis, we obtain the same convergence rate as SCGA under weaker conditional assumptions. We also test the convergence of our algorithm using two non-convex machine learning models.
Citations: 0
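The paper's modified three-term conjugate gradient direction is not given in the abstract, so the sketch below only illustrates the variance-reduction ingredient it builds on, using SVRG (one of the methods cited above) on a non-convex sigmoid loss. The step size, epoch count, and loss function are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def svrg(grad_i, n, w0, lr=0.05, epochs=30, rng=None):
    """SVRG, one of the variance-reduction schemes cited in the abstract.
    grad_i(w, i) returns the gradient of the i-th loss term; the full gradient at
    a snapshot is reused to reduce the variance of each stochastic step. This
    shows the variance technique only, not the paper's three-term CG direction."""
    rng = rng or np.random.default_rng(0)
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(n):
            i = rng.integers(n)
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad   # variance-reduced gradient
            w = w - lr * g
    return w

# toy non-convex problem: sigmoid loss 1 / (1 + exp(y_i * x_i @ w)) with labels in {-1, +1}
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true + 0.3 * rng.normal(size=200))

def grad_i(w, i):
    s = 1.0 / (1.0 + np.exp(y[i] * (X[i] @ w)))   # sigmoid loss value for sample i
    return -y[i] * X[i] * s * (1.0 - s)           # its gradient with respect to w

print(svrg(grad_i, n=200, w0=np.zeros(5)))
```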
Novel sampling method for the von Mises–Fisher distribution
IF 2.2 | Zone 2 (Mathematics)
Statistics and Computing | Pub Date: 2024-03-26 | DOI: 10.1007/s11222-024-10419-3
Abstract: The von Mises–Fisher distribution is a widely used probability model in directional statistics. An algorithm for generating pseudo-random vectors from this distribution was suggested by Wood (Commun Stat Simul Comput 23(1):157–164, 1994), which is based on a rejection sampling scheme. This paper proposes an alternative to this rejection sampling approach for drawing pseudo-random vectors from arbitrary von Mises–Fisher distributions. A useful mixture representation is derived, which is a mixture of beta distributions with mixing weights that follow a confluent hypergeometric distribution. A condensed table-lookup method is adopted for sampling from the confluent hypergeometric distribution. A theoretical analysis investigates the amount of computation required to construct the condensed lookup table. Through numerical experiments, we demonstrate that the proposed algorithm outperforms the rejection-based method when generating a large number of pseudo-random vectors from the same distribution.
Citations: 0
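The paper's beta-mixture and table-lookup construction is not detailed in the abstract, so the sketch below instead implements the rejection-based baseline it is compared against, Wood's (1994) sampler, which the abstract cites explicitly. The Householder step that rotates the pole onto the mean direction is a standard implementation detail, not taken from either paper.

```python
import numpy as np

def sample_vmf_wood(mu, kappa, n, rng=None):
    """Wood's (1994) rejection sampler for the von Mises-Fisher distribution,
    i.e. the baseline the paper's table-lookup mixture method is compared against."""
    rng = rng or np.random.default_rng()
    mu = np.asarray(mu, dtype=float)
    mu /= np.linalg.norm(mu)                      # mean direction on the unit sphere
    d = mu.size
    # envelope constants from Wood (1994)
    b = (-2.0 * kappa + np.sqrt(4.0 * kappa**2 + (d - 1.0)**2)) / (d - 1.0)
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (d - 1.0) * np.log(1.0 - x0**2)

    samples = np.empty((n, d))
    for i in range(n):
        while True:                               # rejection step for W = cos(angle to mu)
            z = rng.beta((d - 1.0) / 2.0, (d - 1.0) / 2.0)
            w = (1.0 - (1.0 + b) * z) / (1.0 - (1.0 - b) * z)
            if kappa * w + (d - 1.0) * np.log(1.0 - x0 * w) - c >= np.log(rng.uniform()):
                break
        v = rng.normal(size=d - 1)                # uniform direction orthogonal to the pole
        v /= np.linalg.norm(v)
        x = np.append(np.sqrt(1.0 - w**2) * v, w)   # draw with the pole at e_d
        u = np.eye(d)[-1] - mu                    # Householder reflection sending e_d to mu
        if np.linalg.norm(u) > 1e-12:
            u /= np.linalg.norm(u)
            x = x - 2.0 * u * (u @ x)
        samples[i] = x
    return samples

draws = sample_vmf_wood(mu=[0.0, 0.0, 1.0], kappa=10.0, n=1000)
print(draws.mean(axis=0))                         # sample mean points roughly along mu
```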