{"title":"Structure-Exploiting variational inference for recurrent switching linear dynamical systems","authors":"Scott W. Linderman, Matthew J. Johnson","doi":"10.1109/CAMSAP.2017.8313132","DOIUrl":null,"url":null,"abstract":"Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics. We can gain insight into these systems by decomposing the data into segments that are each explained by simpler dynamic units. This is the motivation underlying the class of recurrent switching linear dynamical systems (rSLDS) [1], which build on the standard SLDS by introducing a model of how discrete transition probabilities depend on observations or continuous latent states. Previous work relied on Markov chain Monte Carlo algorithms and augmentation schemes for inference, but these methods only applied to a limited class of recurrent dependencies. Here we relax these constraints and consider recurrent dependencies specified by arbitrary parametric, nonlinear functions. We derive two structure-exploiting variational inference algorithms for these challenging models. 
Both leverage the conditionally linear Gaussian and Markovian nature of the models to perform efficient posterior inference.","PeriodicalId":315977,"journal":{"name":"2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CAMSAP.2017.8313132","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics. We can gain insight into these systems by decomposing the data into segments that are each explained by simpler dynamic units. This is the motivation underlying the class of recurrent switching linear dynamical systems (rSLDS) [1], which build on the standard SLDS by introducing a model of how discrete transition probabilities depend on observations or continuous latent states. Previous work relied on Markov chain Monte Carlo algorithms and augmentation schemes for inference, but these methods only applied to a limited class of recurrent dependencies. Here we relax these constraints and consider recurrent dependencies specified by arbitrary parametric, nonlinear functions. We derive two structure-exploiting variational inference algorithms for these challenging models. Both leverage the conditionally linear Gaussian and Markovian nature of the models to perform efficient posterior inference.
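To make the model class concrete, here is a minimal generative sketch of an rSLDS, assuming a particular simple parameterization: per-state linear Gaussian dynamics, a shared linear Gaussian emission, and a recurrent transition whose logits are a linear function of the previous continuous state passed through a softmax. All dimensions, parameter names (`A`, `b`, `C`, `R`, `r`), and noise scales below are illustrative assumptions, not taken from the paper; the paper allows arbitrary parametric nonlinear recurrent dependencies, of which this linear softmax link is one special case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (illustrative, not from the paper):
# K discrete states, D-dim continuous latent, N-dim observations, T time steps.
K, D, N, T = 3, 2, 5, 100

# Per-state linear dynamics: x_t = A[z_t] x_{t-1} + b[z_t] + noise
A = np.stack([0.95 * np.eye(D) for _ in range(K)])
b = rng.normal(scale=0.1, size=(K, D))

# Shared linear Gaussian emission: y_t = C x_t + noise
C = rng.normal(size=(N, D))

# Recurrent transition: logits for z_t depend on the previous continuous state.
R = rng.normal(size=(K, D))  # recurrence weights (hypothetical names)
r = rng.normal(size=K)       # transition bias

def softmax(u):
    u = u - u.max()  # subtract max for numerical stability
    e = np.exp(u)
    return e / e.sum()

x = np.zeros((T, D))
z = np.zeros(T, dtype=int)
y = np.zeros((T, N))

z[0] = rng.integers(K)
x[0] = rng.normal(size=D)
y[0] = C @ x[0] + 0.1 * rng.normal(size=N)

for t in range(1, T):
    # Discrete transition probabilities depend on the continuous latent state;
    # this is the "recurrent" dependency that distinguishes rSLDS from SLDS.
    p = softmax(R @ x[t - 1] + r)
    z[t] = rng.choice(K, p=p)
    # Conditioned on z_t, the continuous dynamics are linear Gaussian --
    # the structure the variational algorithms exploit for efficient inference.
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.05 * rng.normal(size=D)
    y[t] = C @ x[t] + 0.1 * rng.normal(size=N)
```

Conditioned on the discrete sequence `z`, the pairs `(x, y)` form a time-varying linear Gaussian state space model, which is why message passing over the continuous chain remains tractable inside a variational scheme.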