Research Spotlights
Author: Stefan M. Wild
SIAM Review, Volume 66, Issue 1, Page 89, February 2024
DOI: 10.1137/24n975839
Citations: 0
Abstract
SIAM Review, Volume 66, Issue 1, Page 89, February 2024. As modeling, simulation, and data-driven capabilities continue to advance and are adopted for an ever-expanding set of applications and downstream tasks, there is a growing need to quantify the uncertainty in the resulting predictions. In “Easy Uncertainty Quantification (EasyUQ): Generating Predictive Distributions from Single-Valued Model Output,” authors Eva-Maria Walz, Alexander Henzi, Johanna Ziegel, and Tilmann Gneiting provide a methodology for moving beyond deterministic scalar-valued predictions to full predictive distributions. The approach relies on training data of scalar model output–observation pairs and hence does not require access to higher-dimensional inputs or latent variables. The authors use numerical weather prediction as a running example, where one can obtain repeated forecasts, and corresponding observations, of temperature at a specific location. Given a predicted temperature, EasyUQ provides a nonparametric distribution of temperatures around this value. EasyUQ uses the training data to minimize an empirical score subject to a stochastic monotonicity constraint, which ensures that the predictive distributions become stochastically larger as the model output grows. In doing so, the approach inherits the optimality and consistency guarantees enjoyed by so-called isotonic distributional regression methods. The authors emphasize that the basic version of EasyUQ requires no elaborate hyperparameter tuning. They also introduce a more sophisticated version that relies on kernel smoothing to yield predictive probability densities while preserving the key properties of the basic version.
The paper demonstrates how EasyUQ compares with the standard technique of applying a Gaussian error distribution to a deterministic forecast, as well as how EasyUQ can be used to obtain uncertainty estimates for the outputs of artificial neural networks. The approach will be of particular interest in settings where inputs or other latent variables are unreliable or unavailable, since it offers a straightforward yet statistically principled and computationally efficient way of working only with outputs and observations.
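The core mechanism described above can be illustrated with a short sketch of threshold-wise isotonic regression, the building block of isotonic distributional regression: for each threshold z, the indicator 1{y ≤ z} is regressed on the forecast under a monotonicity constraint, so that larger forecasts yield stochastically larger predictive distributions. This is a minimal illustration of the idea, not the authors' implementation; the function names `pava_increasing` and `easyuq_cdfs` are invented for this sketch.

```python
import numpy as np

def pava_increasing(v):
    """Pool-adjacent-violators: least-squares nondecreasing fit to v."""
    means, wts, cnts = [], [], []
    for vi in np.asarray(v, dtype=float):
        means.append(vi); wts.append(1.0); cnts.append(1)
        # merge adjacent blocks while they violate monotonicity
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2, c2 = means.pop(), wts.pop(), cnts.pop()
            m1, w1, c1 = means.pop(), wts.pop(), cnts.pop()
            w = w1 + w2
            means.append((m1 * w1 + m2 * w2) / w)
            wts.append(w); cnts.append(c1 + c2)
    return np.repeat(means, cnts)

def easyuq_cdfs(x, y):
    """Fit IDR-style predictive CDFs at the training forecasts.

    Returns sorted forecasts xs, thresholds zs (the observed values), and
    a matrix F with F[i, j] = estimated P(Y <= zs[j] | forecast xs[i]).
    """
    order = np.argsort(x)
    xs, ys = np.asarray(x)[order], np.asarray(y)[order]
    zs = np.unique(ys)
    F = np.empty((len(xs), len(zs)))
    for j, z in enumerate(zs):
        ind = (ys <= z).astype(float)
        # stochastic monotonicity: F(z | x) must be nonincreasing in x,
        # i.e., nondecreasing along the reversed forecast order
        F[:, j] = pava_increasing(ind[::-1])[::-1]
    return xs, zs, F

# toy forecast-observation pairs with a monotone signal plus noise
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = x + rng.normal(scale=0.5, size=50)
xs, zs, F = easyuq_cdfs(x, y)
```

Each row of `F` is a valid CDF over the observed values, and rows further down (larger forecasts) are pointwise smaller, which is exactly the stochastic ordering the constraint enforces. A predictive distribution for a new forecast can then be read off from the nearest training forecasts.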
Journal Introduction:
Survey and Review features papers that provide an integrative and current viewpoint on important topics in applied or computational mathematics and scientific computing. These papers aim to offer a comprehensive perspective on the subject matter.
Research Spotlights publishes concise research papers in applied and computational mathematics that are of interest to a wide range of readers of SIAM Review. The papers in this section present innovative ideas that are clearly explained and motivated. They stand out from regular publications in specific SIAM journals due to their accessibility and potential for widespread and long-lasting influence.