Vahid Rezaei Tabar, Hosna Fathipor, Horacio Pérez-Sánchez, F. Eskandari, D. Plewczyński
{"title":"混合正向和反向自回归隐马尔可夫模型的时间序列建模","authors":"Vahid Rezaei Tabar, Hosna Fathipor, Horacio Pérez-Sánchez, F. Eskandari, D. Plewczyński","doi":"10.29252/JIRSS.18.1.89","DOIUrl":null,"url":null,"abstract":". Hidden Markov models (HMM) are a ubiquitous tool for modeling time series data. The HMM can be poor at capturing dependency between observations because of the statistical assumptions it makes. Therefore, the extension of the HMM called forward-directed Autoregressive HMM (ARHMM) is considered to handle the dependencies between observations. It is also more appropriate to use an Autoregressive Hidden Markov Model directed backward in time. In this paper, we present a sequence-level mixture of these two forms of ARHMM (called MARHMM), e (cid:11) ectively allowing the model to choose for itself whether a forward-directed or backward-directed model or a soft combination of the two models are most appropriate for a given data set. For this purpose, we use the conditional independence relations in the context of a Bayesian network which is a probabilistic graphical model. The performance of the MARHMM is discussed by applying it to the simulated and real data sets. We show that the proposed model has greater modeling power than the conventional forward-directed ARHMM. The source code is available at https: // bitbucket.org 4dnucleome .","PeriodicalId":42965,"journal":{"name":"JIRSS-Journal of the Iranian Statistical Society","volume":" ","pages":""},"PeriodicalIF":0.1000,"publicationDate":"2019-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Mixture of Forward-Directed and Backward-Directed Autoregressive Hidden Markov Models for Time series Modeling\",\"authors\":\"Vahid Rezaei Tabar, Hosna Fathipor, Horacio Pérez-Sánchez, F. Eskandari, D. Plewczyński\",\"doi\":\"10.29252/JIRSS.18.1.89\",\"DOIUrl\":null,\"url\":null,\"abstract\":\". Hidden Markov models (HMM) are a ubiquitous tool for modeling time series data. The HMM can be poor at capturing dependency between observations because of the statistical assumptions it makes. Therefore, the extension of the HMM called forward-directed Autoregressive HMM (ARHMM) is considered to handle the dependencies between observations. It is also more appropriate to use an Autoregressive Hidden Markov Model directed backward in time. In this paper, we present a sequence-level mixture of these two forms of ARHMM (called MARHMM), e (cid:11) ectively allowing the model to choose for itself whether a forward-directed or backward-directed model or a soft combination of the two models are most appropriate for a given data set. For this purpose, we use the conditional independence relations in the context of a Bayesian network which is a probabilistic graphical model. The performance of the MARHMM is discussed by applying it to the simulated and real data sets. We show that the proposed model has greater modeling power than the conventional forward-directed ARHMM. 
The source code is available at https: // bitbucket.org 4dnucleome .\",\"PeriodicalId\":42965,\"journal\":{\"name\":\"JIRSS-Journal of the Iranian Statistical Society\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.1000,\"publicationDate\":\"2019-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"JIRSS-Journal of the Iranian Statistical Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.29252/JIRSS.18.1.89\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"JIRSS-Journal of the Iranian Statistical Society","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.29252/JIRSS.18.1.89","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Mixture of Forward-Directed and Backward-Directed Autoregressive Hidden Markov Models for Time Series Modeling
Abstract. Hidden Markov models (HMMs) are a ubiquitous tool for modeling time series data. The HMM can be poor at capturing dependencies between observations because of the statistical assumptions it makes. Therefore, an extension of the HMM, the forward-directed autoregressive HMM (ARHMM), is considered to handle the dependencies between observations. For some data sets, an autoregressive hidden Markov model directed backward in time is more appropriate. In this paper, we present a sequence-level mixture of these two forms of ARHMM (called MARHMM), effectively allowing the model to choose for itself whether a forward-directed model, a backward-directed model, or a soft combination of the two is most appropriate for a given data set. For this purpose, we use conditional independence relations in the context of a Bayesian network, which is a probabilistic graphical model. The performance of the MARHMM is discussed by applying it to simulated and real data sets. We show that the proposed model has greater modeling power than the conventional forward-directed ARHMM. The source code is available at https://bitbucket.org/4dnucleome.
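The sequence-level mixture described in the abstract can be summarized with a short formula. The following is a minimal sketch based only on the description above; the mixing weight $w$, the parameter symbols $\lambda_f$ and $\lambda_b$, and the autoregressive order of one are illustrative assumptions, not the paper's own notation:

$$
P(O \mid \lambda) \;=\; w \, P_f(O \mid \lambda_f) \;+\; (1 - w)\, P_b(O \mid \lambda_b), \qquad 0 \le w \le 1,
$$

where $O = (o_1, \dots, o_T)$ is the observation sequence, $P_f$ is the likelihood under a forward-directed ARHMM whose emissions condition on the preceding observation, $P(o_t \mid s_t, o_{t-1})$, and $P_b$ is the likelihood under a backward-directed ARHMM whose emissions condition on the following observation, $P(o_t \mid s_t, o_{t+1})$. Setting $w$ near 1 or 0 recovers a purely forward- or backward-directed model, while intermediate values give the soft combination of the two models mentioned in the abstract.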