{"title":"Measurability and continuity of parametric low-rank approximation in Hilbert spaces: linear operators and random variables","authors":"Nicola Rares Franco","doi":"arxiv-2409.09102","DOIUrl":null,"url":null,"abstract":"We develop a unified theoretical framework for low-rank approximation\ntechniques in parametric settings, where traditional methods like Singular\nValue Decomposition (SVD), Proper Orthogonal Decomposition (POD), and Principal\nComponent Analysis (PCA) face significant challenges due to repeated queries.\nApplications include, e.g., the numerical treatment of parameter-dependent\npartial differential equations (PDEs), where operators vary with parameters,\nand the statistical analysis of longitudinal data, where complex measurements\nlike audio signals and images are collected over time. Although the applied\nliterature has introduced partial solutions through adaptive algorithms, these\nadvancements lack a comprehensive mathematical foundation. As a result, key\ntheoretical questions -- such as the existence and parametric regularity of\noptimal low-rank approximants -- remain inadequately addressed. Our goal is to\nbridge this gap between theory and practice by establishing a rigorous\nframework for parametric low-rank approximation under minimal assumptions,\nspecifically focusing on cases where parameterizations are either measurable or\ncontinuous. The analysis is carried out within the context of separable Hilbert\nspaces, ensuring applicability to both finite and infinite-dimensional\nsettings. 
Finally, connections to recently emerging trends in the Deep Learning\nliterature, relevant for engineering and data science, are also discussed.","PeriodicalId":501162,"journal":{"name":"arXiv - MATH - Numerical Analysis","volume":"25 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09102","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We develop a unified theoretical framework for low-rank approximation
techniques in parametric settings, where traditional methods like Singular
Value Decomposition (SVD), Proper Orthogonal Decomposition (POD), and Principal
Component Analysis (PCA) face significant challenges due to repeated queries.
Applications include the numerical treatment of parameter-dependent
partial differential equations (PDEs), where operators vary with parameters,
and the statistical analysis of longitudinal data, where complex measurements
like audio signals and images are collected over time. Although the applied
literature has introduced partial solutions through adaptive algorithms, these
advancements lack a comprehensive mathematical foundation. As a result, key
theoretical questions -- such as the existence and parametric regularity of
optimal low-rank approximants -- remain inadequately addressed. Our goal is to
bridge this gap between theory and practice by establishing a rigorous
framework for parametric low-rank approximation under minimal assumptions,
specifically focusing on cases where parameterizations are either measurable or
continuous. The analysis is carried out within the context of separable Hilbert
spaces, ensuring applicability to both finite and infinite-dimensional
settings. Finally, connections to recently emerging trends in the Deep Learning
literature, relevant for engineering and data science, are also discussed.
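To make the starting point concrete, the following sketch (not taken from the paper) illustrates the classical, non-parametric problem that SVD, POD, and PCA solve: by the Eckart-Young theorem, truncating the SVD to the r leading singular triplets gives the best rank-r approximation. In the parametric setting the paper studies, the matrix (or operator) A = A(mu) varies with a parameter mu, and the question becomes whether the map mu -> best rank-r approximant of A(mu) can be chosen measurably or continuously.

```python
import numpy as np

def best_rank_r(A, r):
    """Best rank-r approximation of A (Eckart-Young theorem):
    truncate the SVD to the r leading singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# A single snapshot; in the parametric setting this would be A(mu).
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
A2 = best_rank_r(A, 2)

# The truncation error in the spectral norm equals the (r+1)-th
# singular value, and no rank-2 matrix can do better.
sigmas = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - A2, 2)
```

Here `best_rank_r` is a hypothetical helper name introduced only for illustration; the paper's contribution is the measurability/continuity analysis of such approximants as functions of the parameter, not this finite-dimensional construction itself.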