Volume 18, Issue 1 (6-2019) | JIRSS 2019, 18(1): 89-112


Rezaei Tabar V, Fathipor H, Pérez-Sánchez H, Eskandari F, Plewczynski D. Mixture of Forward-Directed and Backward-Directed Autoregressive Hidden Markov Models for Time Series Modeling. JIRSS. 2019; 18(1): 89-112.
URL: http://jirss.irstat.ir/article-1-523-en.html
Department of Statistics, Faculty of Mathematics and Computer Sciences, Allameh Tabataba'i University, Tehran, Iran. vhrezaei@gmail.com
Abstract:

Hidden Markov models (HMMs) are a ubiquitous tool for modeling time series data. The HMM can be poor at capturing dependencies between observations because it assumes the observations are conditionally independent given the hidden states. To handle these dependencies, an extension of the HMM, the forward-directed autoregressive HMM (ARHMM), is considered. For some data sets, however, an autoregressive HMM directed backward in time is more appropriate. In this paper, we present a sequence-level mixture of these two forms of ARHMM (called MARHMM), effectively allowing the model to choose for itself whether a forward-directed model, a backward-directed model, or a soft combination of the two is most appropriate for a given data set. For this purpose, we use conditional independence relations in the context of a Bayesian network, a probabilistic graphical model. The performance of the MARHMM is assessed by applying it to simulated and real data sets. We show that the proposed model has greater modeling power than the conventional forward-directed ARHMM.
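As a rough illustration of the sequence-level mixture idea, the sketch below evaluates a sequence under a two-state Gaussian AR(1)-HMM run forward in time and under a second model applied to the time-reversed sequence (standing in for the backward-directed ARHMM), then combines the two full-sequence likelihoods with a mixture weight. This is a minimal sketch for intuition only, not the authors' implementation: the parameter names (pi, A, coef, intercept, sigma, lam) and the Gaussian AR(1) emission are illustrative assumptions, not the paper's notation or estimation procedure.

```python
# Minimal sketch (assumed, illustrative) of a sequence-level mixture of a
# forward-directed and a backward-directed Gaussian AR(1)-HMM.
import numpy as np
from scipy.stats import norm

def arhmm_loglik(obs, pi, A, coef, intercept, sigma):
    """Log-likelihood of obs under a Gaussian AR(1)-HMM via the forward algorithm.
    Emission in state k at time t: N(coef[k]*obs[t-1] + intercept[k], sigma[k]^2);
    the first observation uses the intercept alone."""
    T, K = len(obs), len(pi)
    log_alpha = np.log(pi) + norm.logpdf(obs[0], intercept, sigma)
    for t in range(1, T):
        mean = coef * obs[t - 1] + intercept              # state-wise AR(1) mean
        log_emit = norm.logpdf(obs[t], mean, sigma)
        # alpha_t(k) = b_k(o_t) * sum_j alpha_{t-1}(j) A[j, k], in log space
        log_alpha = log_emit + np.logaddexp.reduce(
            log_alpha[:, None] + np.log(A), axis=0)
    return np.logaddexp.reduce(log_alpha)

def marhmm_loglik(obs, fwd_params, bwd_params, lam):
    """Sequence-level mixture: lam * P_forward(O) + (1 - lam) * P_backward(O).
    Here the backward-directed model is evaluated on the reversed sequence."""
    ll_f = arhmm_loglik(obs, *fwd_params)
    ll_b = arhmm_loglik(obs[::-1], *bwd_params)
    return np.logaddexp(np.log(lam) + ll_f, np.log(1 - lam) + ll_b)

# Toy usage with arbitrary 2-state parameters
rng = np.random.default_rng(0)
obs = rng.standard_normal(50)
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
params = (pi, A, np.array([0.7, -0.3]), np.array([0.0, 1.0]), np.array([1.0, 0.5]))
print(marhmm_loglik(obs, params, params, lam=0.6))
```

In this sketch the mixture weight lam is fixed; in a full treatment it would be estimated from the data together with the two sets of ARHMM parameters (e.g., by an EM-type procedure), which is what lets the model favor the forward-directed form, the backward-directed form, or a soft combination of the two.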

Full-Text [PDF 279 kb]
Type of Study: Original Paper | Subject: 60Jxx: Markov processes
Received: 2018/05/22 | Accepted: 2018/09/02 | Published: 2018/09/02
