{"title":"基于方差分析分解的稀疏混合模型","authors":"J. Hertrich, F. Ba, G. Steidl","doi":"10.1553/etna_vol55s142","DOIUrl":null,"url":null,"abstract":"Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussianuniform mixture model on the high-dimensional torus which relies on the assumption that the function that we wish to approximate can be well explained by limited variable interactions. We consider three model approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities defined on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood function of the mixture model and a regularizer that penalizes the number of its summands. For minimizing this functional we combine the Expectation Maximization algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.","PeriodicalId":282695,"journal":{"name":"ETNA - Electronic Transactions on Numerical Analysis","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Sparse mixture models inspired by ANOVA decompositions\",\"authors\":\"J. Hertrich, F. Ba, G. Steidl\",\"doi\":\"10.1553/etna_vol55s142\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussianuniform mixture model on the high-dimensional torus which relies on the assumption that the function that we wish to approximate can be well explained by limited variable interactions. We consider three model approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities defined on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood function of the mixture model and a regularizer that penalizes the number of its summands. For minimizing this functional we combine the Expectation Maximization algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. 
Numerical examples demonstrate the performance of our approach.\",\"PeriodicalId\":282695,\"journal\":{\"name\":\"ETNA - Electronic Transactions on Numerical Analysis\",\"volume\":\"41 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ETNA - Electronic Transactions on Numerical Analysis\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1553/etna_vol55s142\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ETNA - Electronic Transactions on Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1553/etna_vol55s142","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sparse mixture models inspired by ANOVA decompositions
Abstract: Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussian-uniform mixture model on the high-dimensional torus. The model relies on the assumption that the function we wish to approximate can be well explained by interactions of only a few variables. We consider three model approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood of the mixture model and a regularizer that penalizes the number of its summands. To minimize this functional, we combine the expectation-maximization (EM) algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.
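The following minimal sketch (our illustration, not the authors' code) shows the structure of such a sparse summand for the diagonal wrapped Gaussian variant, assuming the torus is [0, 1)^d: each summand is a product of one-dimensional wrapped normal densities on a small set of active coordinates and the uniform density, which is identically 1 on [0, 1), on all remaining coordinates. The truncation bound K of the wrapping sum and all function names are our assumptions.

```python
import numpy as np

def wrapped_normal_pdf(x, mu, sigma, K=10):
    """1D wrapped normal density on [0, 1): a sum of shifted Gaussian copies.
    Truncating the infinite sum at |k| <= K is accurate for small sigma."""
    k = np.arange(-K, K + 1)
    diffs = x[..., None] - mu + k                      # broadcast over shifts
    return np.exp(-0.5 * (diffs / sigma) ** 2).sum(-1) / (sigma * np.sqrt(2 * np.pi))

def sparse_summand_pdf(X, support, mu, sigma):
    """Product density: diagonal wrapped Gaussians on the coordinates in
    `support`, uniform (density 1 on [0, 1)) on all remaining coordinates,
    so the inactive directions contribute no factor at all."""
    pdf = np.ones(X.shape[0])
    for j, i in enumerate(support):
        pdf *= wrapped_normal_pdf(X[:, i], mu[j], sigma[j])
    return pdf

def mixture_pdf(X, weights, supports, mus, sigmas):
    """Sparse Gaussian-uniform mixture: convex combination of the summands."""
    return sum(a * sparse_summand_pdf(X, u, m, s)
               for a, u, m, s in zip(weights, supports, mus, sigmas))

# Example: a 2-component mixture on the 5-dimensional torus with supports
# {0, 1} and {2}; the remaining coordinates are modeled as uniform.
X = np.random.rand(1000, 5)
dens = mixture_pdf(X, weights=[0.7, 0.3], supports=[[0, 1], [2]],
                   mus=[[0.2, 0.8], [0.5]], sigmas=[[0.05, 0.05], [0.1]])
```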
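To make the learning procedure concrete, the next sketch (again ours, under simplifying assumptions: the component densities are held fixed and only the mixture weights are updated) mirrors the structure of the penalized EM described in the abstract, not the paper's exact update rules. An EM iteration on the weights is followed by a hard-thresholding proximal step for an l0-type penalty on the number of summands, and a Kolmogorov-Smirnov test against the uniform distribution serves as a proxy for deciding whether a coordinate carries structure.

```python
import numpy as np
from scipy.stats import kstest

def em_with_prox(X, weights, component_pdfs, thresh, n_iter=100):
    """Illustrative penalized EM on the mixture weights only.  The proximal
    step for an l0-type penalty amounts to hard-thresholding: components
    whose weight drops below `thresh` are removed and the remaining weights
    are renormalized."""
    P = np.stack([f(X) for f in component_pdfs], axis=1)   # (n, K) densities
    weights = np.asarray(weights, dtype=float)
    for _ in range(n_iter):
        R = weights * P                                    # E-step: joint densities
        R /= R.sum(axis=1, keepdims=True)                  # responsibilities
        weights = R.mean(axis=0)                           # M-step: weight update
        weights[weights < thresh] = 0.0                    # proximal step
        weights /= weights.sum()                           # renormalize survivors
    return weights

def looks_uniform(samples_1d, alpha=0.05):
    """If a coordinate's marginal is close to uniform on [0, 1], that
    direction can be left to the uniform factor of the summand."""
    return kstest(samples_1d, "uniform").pvalue > alpha
```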