

Allameh Tabataba'i University
Abstract:

Hidden Markov models (HMMs) are a ubiquitous tool for modeling time series data. However, because of the statistical assumptions it makes, the HMM can be poor at capturing dependence between successive observations. An extension of the HMM, the forward-directed autoregressive HMM (ARHMM), is therefore considered to handle dependencies between observations; an autoregressive hidden Markov model directed backward in time can also be appropriate. In this paper, we present a sequence-level mixture of these two forms of ARHMM (called MARHMM), effectively allowing the model to choose for itself whether a forward-directed model, a backward-directed model, or a soft combination of the two is more appropriate for a given data set. For this purpose, we use conditional independence relations in the context of a Bayesian network, a probabilistic graphical model. The performance of the MARHMM is assessed by applying it to simulated and real data sets. We show that the proposed model has greater modeling power than the conventional forward-directed ARHMM. The source code is available at https://bitbucket.org/4dnucleome/marhmm/
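To make the sequence-level mixture idea concrete, the following is a minimal sketch, not the authors' implementation (see the Bitbucket repository for that). It assumes a Gaussian AR(1) emission model with shared parameters for both directions, and it approximates the backward-directed ARHMM by scoring the time-reversed sequence; the paper's model may parameterize the two directions separately. All function and parameter names here are illustrative.

```python
import math

def logsumexp(xs):
    # numerically stable log(sum(exp(x) for x in xs))
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def ar_hmm_loglik(obs, pi, A, phi, c, sigma):
    # Forward algorithm for a K-state Gaussian AR(1)-HMM:
    #   y_t | state k, y_{t-1} ~ N(c[k] + phi[k] * y_{t-1}, sigma[k]^2)
    # pi: initial state probabilities, A: transition matrix.
    K = len(pi)

    def log_emit(k, t):
        mean = c[k] + (phi[k] * obs[t - 1] if t > 0 else 0.0)
        z = (obs[t] - mean) / sigma[k]
        return -0.5 * z * z - math.log(sigma[k] * math.sqrt(2 * math.pi))

    log_alpha = [math.log(pi[k]) + log_emit(k, 0) for k in range(K)]
    for t in range(1, len(obs)):
        log_alpha = [
            logsumexp([log_alpha[j] + math.log(A[j][k]) for j in range(K)])
            + log_emit(k, t)
            for k in range(K)
        ]
    return logsumexp(log_alpha)

def marhmm_loglik(obs, params, w):
    # Sequence-level mixture: with weight w, the sequence is explained by
    # the forward-directed ARHMM; with weight 1 - w, by the
    # backward-directed one (here: the same model on the reversed sequence).
    log_fwd = ar_hmm_loglik(obs, *params)
    log_bwd = ar_hmm_loglik(list(reversed(obs)), *params)
    return logsumexp([math.log(w) + log_fwd, math.log(1 - w) + log_bwd])
```

Because the mixture is taken at the sequence level, the combined log-likelihood always lies between the forward-directed and backward-directed log-likelihoods, and fitting `w` lets the data decide which direction (or blend) fits best.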

     
Type of Study: Original Paper | Subject: 60Jxx: Markov processes
Received: 2018/05/22 | Accepted: 2018/09/02 | Published: 2018/09/02



© 2015 All Rights Reserved | Journal of The Iranian Statistical Society