Robust signal recovery in Hadamard spaces
Georg Köstenberger, Thomas Stark
Journal of Multivariate Analysis, Volume 210, Article 105469
DOI: 10.1016/j.jmva.2025.105469
Published: 2025-06-30 · Impact Factor: 1.4 · JCR: Q2 (Statistics & Probability)
Full text: https://www.sciencedirect.com/science/article/pii/S0047259X25000648
Citations: 0
Abstract
We analyze the stability of (strong) laws of large numbers in Hadamard spaces with respect to distributional perturbations. For the inductive means of a sequence of independent but not necessarily identically distributed random variables, we provide a concentration inequality in quadratic mean and a strong law of large numbers, generalizing a classical result of K.-T. Sturm. For the Fréchet mean, we generalize H. Ziezold's law of large numbers in Hadamard spaces. In this case, we require our data to be neither independent nor identically distributed; reasonably mild conditions on the first two moments of our sample are enough. Additionally, we look at data contamination via a model inspired by Huber's ε-contamination model, in which we replace a random portion of the data with noise. In the most general setup, we neither require the data nor the noise to be i.i.d., nor do we require the noise to be independent of the data. A resampling scheme is introduced to analyze the stability of the (non-symmetric) inductive mean with respect to data loss, data permutation, and noise, and sufficient conditions for its convergence are provided. These results suggest that means in Hadamard spaces are as robust as those in Euclidean spaces. This is underlined by a small simulation study in which we compare the robustness of means on the manifold of positive definite matrices with means on open books.
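To make the two central objects of the abstract concrete, here is a minimal sketch (not taken from the paper) of the inductive mean on the manifold of symmetric positive definite matrices, together with a Huber-style ε-contamination step. It assumes the affine-invariant geometry on SPD matrices, where the geodesic from A to B is A^{1/2}(A^{-1/2} B A^{-1/2})^t A^{1/2}; the inductive mean then steps from the running mean toward each new observation by 1/n. All function names are illustrative.

```python
import numpy as np

def spd_power(a, t):
    # A^t for a symmetric positive definite matrix A, via eigendecomposition
    w, v = np.linalg.eigh(a)
    return (v * w**t) @ v.T

def geodesic(a, b, t):
    # Point at parameter t on the affine-invariant geodesic from a to b
    a_half, a_ihalf = spd_power(a, 0.5), spd_power(a, -0.5)
    return a_half @ spd_power(a_ihalf @ b @ a_ihalf, t) @ a_half

def inductive_mean(samples):
    # s_1 = x_1; s_n is the point 1/n of the way from s_{n-1} to x_n
    s = samples[0]
    for n, x in enumerate(samples[1:], start=2):
        s = geodesic(s, x, 1.0 / n)
    return s

def contaminate(samples, noise, eps, rng):
    # Huber-style eps-contamination: each observation is independently
    # replaced by the corresponding noise draw with probability eps
    return [y if rng.random() < eps else x for x, y in zip(samples, noise)]
```

For commuting matrices the geodesic reduces to A^{1-t} B^t, so the inductive mean of positive scalars recovers the ordinary geometric mean; note that, as the abstract emphasizes, this mean is non-symmetric — permuting the samples can change the result.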
About the journal:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.