{"title":"Nonparametric Bayesian machine learning and signal processing","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0010","DOIUrl":null,"url":null,"abstract":"We have seen that stochastic processes play an important foundational role in a wide range of methods in DSP. For example, we treat a discrete-time signal as a Gaussian process, and thereby obtain many mathematically simplified algorithms, particularly based on the power spectral density. At the same time, in machine learning, it has generally been observed that nonparametric methods outperform parametric methods in terms of predictive accuracy since they can adapt to data with arbitrary complexity. However, these techniques are not Bayesian so we are unable to do important inferential procedures such as draw samples from the underlying probabilistic model or compute posterior confidence intervals. But, Bayesian models are often only mathematically tractable if parametric, with the corresponding loss of predictive accuracy. An alternative, discussed in this section, is to extend the mathematical tractability of stochastic processes to Bayesian methods. This leads to so-called Bayesian nonparametrics exemplified by techniques such as Gaussian process regression and Dirichlet process mixture modelling that have been shown to be extremely useful in practical DSP and machine learning applications.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":"46 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/oso/9780198714934.003.0010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
We have seen that stochastic processes play an important foundational role in a wide range of DSP methods. For example, by treating a discrete-time signal as a Gaussian process we obtain many mathematically simplified algorithms, particularly those based on the power spectral density. At the same time, in machine learning it has generally been observed that nonparametric methods outperform parametric methods in predictive accuracy, since they can adapt to data of arbitrary complexity. However, these techniques are not Bayesian, so we cannot carry out important inferential procedures such as drawing samples from the underlying probabilistic model or computing posterior confidence intervals. Bayesian models, by contrast, are often mathematically tractable only when parametric, with a corresponding loss of predictive accuracy. An alternative, discussed in this section, is to extend the mathematical tractability of stochastic processes to Bayesian methods. This leads to so-called Bayesian nonparametrics, exemplified by techniques such as Gaussian process regression and Dirichlet process mixture modelling, which have proved extremely useful in practical DSP and machine learning applications.
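To give a flavour of one of the techniques named above, here is a minimal sketch of Gaussian process regression with a squared-exponential kernel, written in plain NumPy. The kernel choice, the fixed hyperparameters, and the sinusoidal toy data are illustrative assumptions and are not taken from the chapter itself.

```python
# A minimal sketch of Gaussian process regression with a squared-exponential
# (RBF) kernel, using only NumPy. Hyperparameters (length scale, signal and
# noise variance) are fixed for illustration rather than learned from data.
import numpy as np

def sq_exp_kernel(xa, xb, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = xa[:, None] - xb[None, :]           # pairwise input differences
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
    """Posterior mean and covariance of a zero-mean GP at the test inputs."""
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_test)
    K_ss = sq_exp_kernel(x_test, x_test)
    # Cholesky factorisation gives a numerically stable solve of K^{-1}.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, cov

# Toy example: noisy samples of a sinusoid, predicted on a dense grid.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 20)
y = np.sin(x) + 0.2 * rng.standard_normal(x.shape)
x_star = np.linspace(0, 2 * np.pi, 200)
mu, cov = gp_posterior(x, y, x_star)
# Pointwise 95% posterior interval (diagonal clipped to guard against
# tiny negative values from round-off).
ci = 1.96 * np.sqrt(np.maximum(np.diag(cov), 0.0))
```

The Cholesky-based solve is the standard numerically stable route to the GP posterior, and the diagonal of the posterior covariance yields exactly the kind of posterior confidence interval referred to in the abstract.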