Book Review: Algorithmic Mathematics in Machine Learning
Hollis Williams, Azza M. Algatheem
SIAM Review, Volume 67, Issue 2, Page 406-408, May 2025. Published 2025-05-08. DOI: 10.1137/24m1702611.

The 2024 Nobel Prize in Physics was awarded to John Hopfield and Geoffrey Hinton for their work on artificial intelligence and machine learning. The award has been somewhat controversial in the physics community and prompted some heated debates, since the only apparent use of physics is the Boltzmann distribution in the sampling function of the Boltzmann machine [D. H. Ackley, G. E. Hinton, and T. J. Sejnowski, Cog. Sci., 9 (1985), pp. 147–169]. If we leave aside this debate for the time being, it is undeniable that artificial intelligence and machine learning have had a transformative effect on various areas of science and technology.
Computerized Tomography and Reproducing Kernels
Ho Yun, Victor M. Panaretos
SIAM Review, Volume 67, Issue 2, Page 321-350, May 2025. Published 2025-05-08. DOI: 10.1137/23m1616716.

Abstract. The X-ray transform is one of the most fundamental integral operators in image processing and reconstruction. In this paper, we revisit the formalism of the X-ray transform by considering it as an operator between reproducing kernel Hilbert spaces (RKHSs). Within this framework, the X-ray transform can be viewed as a natural analogue of Euclidean projection. The RKHS framework considerably simplifies projection image interpolation and leads to an analogue of the celebrated representer theorem for the problem of tomographic reconstruction. The resulting methodology is dimension-free and stands apart from conventional filtered backprojection techniques, as it does not hinge on the Fourier transform. It also allows us to establish sharp stability results at a genuinely functional level (i.e., without recourse to discretization), but in the realistic setting where the data are discrete and noisy. The RKHS framework is versatile, accommodating any reproducing kernel on a unit ball and affording a high level of generality. When the kernel is chosen to be rotation-invariant, explicit spectral representations can be obtained, elucidating the regularity structure of the associated Hilbert spaces. Moreover, the reconstruction problem can be solved at the same computational cost as filtered backprojection.
Multiobjective Optimization Using the R2 Utility
Ben Tu, Nikolas Kantas, Robert M. Lee, Behrang Shafei
SIAM Review, Volume 67, Issue 2, Page 213-255, May 2025. Published 2025-05-08. DOI: 10.1137/23m1578371.

Abstract. The goal of multiobjective optimization is to identify a collection of points which describe the best possible trade-offs among the multiple objectives. In order to solve this vector-valued optimization problem, practitioners often appeal to scalarization functions, which transform the multiobjective problem into a collection of single-objective problems. This set of scalarized problems can then be solved using traditional single-objective optimization techniques. In this paper, we formalize this convention into a general mathematical framework. We show how this strategy effectively recasts the original multiobjective optimization problem into a single-objective optimization problem defined over sets. An appropriate class of objective functions for this new problem is that of the R2 utilities, which are utility functions defined as a weighted integral over the scalarized optimization problem. As part of our work, we show that these utilities are monotone and submodular set functions that can be optimized effectively using greedy optimization algorithms. We then analyze the performance of these greedy algorithms both theoretically and empirically. Our analysis largely focuses on Bayesian optimization, which is a popular probabilistic framework for black-box optimization.
Book Review: Essential Statistics for Data Science: A Concise Crash Course
David Banks
SIAM Review, Volume 67, Issue 1, Page 206-207, March 2025. Published 2025-02-06. DOI: 10.1137/24m167562x.

This is a bold book! Professor Zhu wants to provide the basic statistical knowledge needed by data scientists in a super-short volume. It reminds me a bit of Larry Wasserman’s All of Statistics (Springer, 2014), but is aimed at Masters students (often from fields other than statistics) or advanced undergraduates (also often from other fields). As an attendee at far too many faculty meetings, I applaud brevity and focus. As an amateur stylist, I admire strong technical writing. And as an applied statistician who has taught basic statistics to Masters and Ph.D. students from other disciplines, I appreciate the need for a book of this kind. For the right course I would happily use this book, although I would need to supplement it with other material.
The Troublesome Kernel: On Hallucinations, No Free Lunches, and the Accuracy-Stability Tradeoff in Inverse Problems
Nina M. Gottschling, Vegard Antun, Anders C. Hansen, Ben Adcock
SIAM Review, Volume 67, Issue 1, Page 73-104, March 2025. Published 2025-02-06. DOI: 10.1137/23m1568739.

Abstract. Methods inspired by artificial intelligence (AI) are starting to fundamentally change computational science and engineering through breakthrough performance on challenging problems. However, the reliability and trustworthiness of such techniques is a major concern. In inverse problems in imaging, the focus of this paper, there is increasing empirical evidence that methods may suffer from hallucinations, i.e., false, but realistic-looking artifacts; instability, i.e., sensitivity to perturbations in the data; and unpredictable generalization, i.e., excellent performance on some images, but significant deterioration on others. This paper provides a theoretical foundation for these phenomena. We give mathematical explanations for how and when such effects arise in arbitrary reconstruction methods, with several of our results taking the form of “no free lunch” theorems. Specifically, we show that (i) methods that overperform on a single image can wrongly transfer details from one image to another, creating a hallucination; (ii) methods that overperform on two or more images can hallucinate or be unstable; (iii) optimizing the accuracy-stability tradeoff is generally difficult; (iv) hallucinations and instabilities, if they occur, are not rare events and may be encouraged by standard training; and (v) it may be impossible to construct optimal reconstruction maps for certain problems. Our results trace these effects to the kernel of the forward operator whenever it is nontrivial, but also apply to the case when the forward operator is ill-conditioned. Based on these insights, our work aims to spur research into new ways to develop robust and reliable AI-based methods for inverse problems in imaging.
Book Review: Numerical Methods in Physics with Python. Second Edition
Gabriele Ciaramella
SIAM Review, Volume 67, Issue 1, Page 204-205, March 2025. Published 2025-02-06. DOI: 10.1137/24m1650466.

Numerical Methods in Physics with Python by Alex Gezerlis is an excellent example of a textbook built on long and established teaching experience. The goals are clearly defined in the preface: Gezerlis aims to gently introduce undergraduate physics students to the field of numerical methods and their concrete implementation in Python. To this end, the author adopts a physics-applications-first approach. Every chapter begins with a motivation section on real physics problems (simple but adequate for undergraduate students), ends with a concrete project on a physics application, and is completed by a rich list of exercises, often designed with a physics appeal.
Limits of Learning Dynamical Systems
Tyrus Berry, Suddhasattwa Das
SIAM Review, Volume 67, Issue 1, Page 107-137, March 2025. Published 2025-02-06. DOI: 10.1137/24m1696974.

Abstract. A dynamical system is a transformation of a phase space, and the transformation law is the primary means of defining, as well as identifying, the dynamical system; it is also the object on which many learning techniques focus. However, there are many secondary aspects of dynamical systems—invariant sets, the Koopman operator, and Markov approximations—that provide alternative objectives for learning techniques. Crucially, while many learning methods focus on the transformation law, we find that forecast performance can depend on how well these other aspects of the dynamics are approximated. These different facets of a dynamical system correspond to objects in completely different spaces—namely, interpolation spaces, compact Hausdorff sets, unitary operators, and Markov operators, respectively. Thus, learning techniques targeting any of these four facets perform different kinds of approximations. We examine whether an approximation of any one of these aspects of the dynamics could lead to an approximation of another facet. Many connections and obstructions are brought to light in this analysis. Special focus is placed on methods for learning the primary feature—the dynamical law itself. The main question considered is the connection between learning this law and reconstructing the Koopman operator and the invariant set. The answers are tied to the ergodic and topological properties of the dynamics, and they reveal how these properties determine the limits of forecasting techniques.