{"title":"Nonlinear and non-Gaussian signal processing","authors":"Max A. Little","doi":"10.1093/oso/9780198714934.003.0009","DOIUrl":null,"url":null,"abstract":"Linear, time-invariant (LTI) Gaussian DSP, has substantial mathematical conveniences that make it valuable in practical DSP applications and machine learning. When the signal really is generated by such an LTI-Gaussian model then this kind of processing is optimal from a statistical point of view. However, there are substantial limitations to the use of these techniques when we cannot guarantee that the assumptions of linearity, time-invariance and Gaussianity hold. In particular, signals that exhibit jumps or significant non-Gaussian outliers cause substantial adverse effects such as Gibb's phenomena in LTI filter outputs, and nonstationary signals cannot be compactly represented in the Fourier domain. In practice, many real signals show such phenomena to a greater or lesser degree, so it is important to have a `toolkit' of DSP methods that are effective in many situations. This chapter is dedicated to exploring the use of the statistical machine learning concepts in DSP.","PeriodicalId":73290,"journal":{"name":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2019-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE International Workshop on Machine Learning for Signal Processing : [proceedings]. IEEE International Workshop on Machine Learning for Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/oso/9780198714934.003.0009","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Linear time-invariant (LTI) Gaussian DSP has substantial mathematical conveniences that make it valuable in practical DSP applications and machine learning. When the signal really is generated by such an LTI-Gaussian model, this kind of processing is optimal from a statistical point of view. However, there are substantial limitations to these techniques when we cannot guarantee that the assumptions of linearity, time-invariance and Gaussianity hold. In particular, signals that exhibit jumps or significant non-Gaussian outliers cause substantial adverse effects, such as Gibbs phenomena, in LTI filter outputs, and nonstationary signals cannot be compactly represented in the Fourier domain. In practice, many real signals show such phenomena to a greater or lesser degree, so it is important to have a 'toolkit' of DSP methods that are effective in many situations. This chapter is dedicated to exploring the use of statistical machine learning concepts in DSP.
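As a concrete illustration of the ringing mentioned above, the following minimal sketch (not taken from the chapter; all names and parameter values are illustrative assumptions) applies a truncated-sinc LTI low-pass filter to a noisy step signal, producing Gibbs-type overshoot near the jump, and contrasts it with a simple nonlinear median filter that preserves the edge.

```python
# Sketch: Gibbs-type ringing from an LTI low-pass filter at a jump,
# versus a nonlinear median filter. Illustrative only; assumes NumPy/SciPy.
import numpy as np
from scipy.signal import firwin, filtfilt, medfilt

rng = np.random.default_rng(0)
n = 500
x = np.where(np.arange(n) < n // 2, 0.0, 1.0)      # step (jump) signal
y = x + 0.05 * rng.standard_normal(n)              # mild Gaussian noise

# LTI low-pass: truncated-sinc (boxcar-windowed) FIR -> ringing at the jump
b = firwin(numtaps=101, cutoff=0.05, window="boxcar")
lti_out = filtfilt(b, [1.0], y)

# Nonlinear alternative: running median preserves the jump without ringing
med_out = medfilt(y, kernel_size=21)

# Compare overshoot above the upper step level near the discontinuity
edge = slice(n // 2 - 30, n // 2 + 30)
print("max overshoot, LTI low-pass :", lti_out[edge].max() - 1.0)
print("max overshoot, median filter:", med_out[edge].max() - 1.0)
```

The boxcar window is chosen deliberately here: truncating the ideal sinc response without tapering is what produces the classic Gibbs overshoot, whereas the median filter, being nonlinear, does not trade edge fidelity for frequency selectivity in the same way.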