TY - JOUR
T1 - Generalized Baum-Welch and Viterbi Algorithms Based on the Direct Dependency among Observations
TT - تعمیم الگوریتمهای بام-ولش و ویتربی بر اساس وابستگی مستقیم بین مشاهدات
JF - JIRSS
JO - JIRSS
VL - 18
IS - 1
UR - http://jirss.irstat.ir/article-1-434-en.html
Y1 - 2017
SP - 0
EP - 0
KW - Baum-Welch Algorithm
KW - Bayesian Network
KW - Hidden Markov Model
KW - Viterbi Algorithm
N2 - The parameters of a Hidden Markov Model (HMM) are the transition and emission probabilities, both of which can be estimated with the Baum-Welch algorithm. The most likely sequence of hidden states, given the sequence of observations, is found by the Viterbi algorithm. Both algorithms assume that, given the states, the observations are independent of each other. In this paper, we first allow a direct dependency between consecutive observations in the HMM, and then use the conditional independence relations of a Bayesian network, a probabilistic graphical model, to generalize the Baum-Welch and Viterbi algorithms. We compare the performance of the generalized algorithms with that of the standard ones in simulation studies on synthetic data. We then apply the algorithms to real biological and inflation datasets, and show that the generalized Baum-Welch and Viterbi algorithms significantly outperform the conventional ones as the sample size grows.
ER -