{"title":"Sequential Variational Learning of Dynamic Factor Mixtures","authors":"A. Samé","doi":"10.1109/ICDMW.2018.00105","DOIUrl":null,"url":null,"abstract":"The clustering of panel data remains a challenging problem, considering their dynamic and potentially massive nature. The massive aspect of panel data can be related to their number of observations and/or their high dimensionality. In this article, a new model and its estimation method are initiated to tackle these problems. The proposed model is a mixture distribution whose components are dynamic factor analyzers. The model inference, which cannot be performed exactly by classical methods, is realized in the sequential variational framework. In particular, it is established that the proposed algorithm converges in the sense of stochastic gradient algorithms toward an average lower variational bound. Experiments conducted on simulated data illustrate the good practical behavior of the method.","PeriodicalId":259600,"journal":{"name":"2018 IEEE International Conference on Data Mining Workshops (ICDMW)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Data Mining Workshops (ICDMW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDMW.2018.00105","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The clustering of panel data remains a challenging problem, given their dynamic and potentially massive nature. The massive aspect of panel data can stem from their number of observations and/or their high dimensionality. In this article, a new model and its estimation method are introduced to tackle these problems. The proposed model is a mixture distribution whose components are dynamic factor analyzers. Exact inference in this model cannot be performed with classical methods; it is instead carried out within a sequential variational framework. In particular, it is established that the proposed algorithm converges, in the sense of stochastic gradient algorithms, toward an averaged variational lower bound. Experiments conducted on simulated data illustrate the good practical behavior of the method.
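To make the generative setting concrete, the following is a minimal simulation sketch of panel data drawn from a mixture whose components are dynamic factor analyzers. It is not the paper's exact specification: the autoregressive factor dynamics, the dimensions, and all parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumed parameterization, not the paper's exact model):
# each mixture component k has a low-dimensional latent factor following
# linear-Gaussian dynamics z_t = A_k z_{t-1} + noise, and observations
# y_t = W_k z_t + mu_k + noise. A panel individual is assigned to one
# component and generates a whole sequence from it.

rng = np.random.default_rng(0)

n_components = 3     # number of mixture components (clusters)
n_series = 60        # number of panel individuals
T = 50               # length of each observed sequence
d_obs = 10           # observed dimensionality
d_factor = 2         # latent factor dimensionality

weights = np.array([0.3, 0.3, 0.4])                                     # mixing proportions
A = [0.9 * np.eye(d_factor) for _ in range(n_components)]               # factor dynamics
W = [rng.normal(size=(d_obs, d_factor)) for _ in range(n_components)]   # factor loadings
mu = [rng.normal(scale=2.0, size=d_obs) for _ in range(n_components)]   # component means
sigma_z, sigma_y = 0.5, 0.3                                             # noise std devs

def simulate_sequence(k):
    """Simulate one observed sequence from mixture component k."""
    z = rng.normal(size=d_factor)                                # initial factor
    ys = np.empty((T, d_obs))
    for t in range(T):
        z = A[k] @ z + sigma_z * rng.normal(size=d_factor)       # factor transition
        ys[t] = W[k] @ z + mu[k] + sigma_y * rng.normal(size=d_obs)  # emission
    return ys

labels = rng.choice(n_components, size=n_series, p=weights)      # cluster assignments
panel = np.stack([simulate_sequence(k) for k in labels])         # shape (n_series, T, d_obs)
print(panel.shape, np.bincount(labels))
```

Clustering such data amounts to recovering the component assignments (and the component-specific loadings and dynamics) from the observed sequences alone, which the paper addresses via sequential variational inference rather than exact posterior computation.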