Author affiliations: Univ Toronto, Dept Comp Sci, Toronto ON M5S 3H5, Canada; MIT, Dept Brain & Cognit Sci, Cambridge MA 02139, USA
Publication: Machine Learning
Year/Volume/Issue: 1997, Vol. 29, No. 2-3
Pages: 245-273
Core indexing:
Subject classification: 08 [Engineering], 0812 [Engineering - Computer Science and Technology (degrees conferrable in Engineering or Science)]
Keywords: Hidden Markov models, time series, EM algorithm, graphical models, Bayesian networks, mean field theory
Abstract: Hidden Markov models (HMMs) have proven to be one of the most widely used tools for learning probabilistic models of time series data. In an HMM, information about the past is conveyed through a single discrete variable, the hidden state. We discuss a generalization of HMMs in which this state is factored into multiple state variables and is therefore represented in a distributed manner. We describe an exact algorithm for inferring the posterior probabilities of the hidden state variables given the observations, and relate it to the forward-backward algorithm for HMMs and to algorithms for more general graphical models. Due to the combinatorial nature of the hidden state representation, this exact algorithm is intractable. As in other intractable systems, approximate inference can be carried out using Gibbs sampling or variational methods. Within the variational framework, we present a structured approximation in which the state variables are decoupled, yielding a tractable algorithm for learning the parameters of the model. Empirical comparisons suggest that these approximations are efficient and provide accurate alternatives to the exact methods. Finally, we use the structured approximation to model Bach's chorales and show that factorial HMMs can capture statistical structure in this data set which an unconstrained HMM cannot.
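To illustrate the combinatorial blow-up the abstract refers to, the following is a minimal sketch, not the authors' code: a factorial HMM with M chains of K states each, Gaussian observations whose mean is the sum of per-chain contributions (an assumed parameterization), and exact forward filtering performed by collapsing the factored state into a single chain over all K**M joint configurations.

import itertools
import numpy as np

rng = np.random.default_rng(0)

M, K, D, T = 3, 2, 2, 50          # chains, states per chain, obs dim, sequence length

# Per-chain transition matrices and initial distributions (rows sum to 1).
A = [rng.dirichlet(np.ones(K), size=K) for _ in range(M)]
pi = [rng.dirichlet(np.ones(K)) for _ in range(M)]
# Chain m contributes mean vector W[m][k] when it is in state k (assumed emission model).
W = [rng.normal(size=(K, D)) for _ in range(M)]
obs_var = 0.5

# Enumerate the joint state space: K**M combinations of per-chain states.
joint_states = list(itertools.product(range(K), repeat=M))
S = len(joint_states)              # = K**M, the source of intractability

joint_pi = np.array([np.prod([pi[m][s[m]] for m in range(M)]) for s in joint_states])
joint_A = np.array([[np.prod([A[m][s[m], t[m]] for m in range(M)])
                     for t in joint_states] for s in joint_states])
joint_mu = np.array([sum(W[m][s[m]] for m in range(M)) for s in joint_states])

def loglik(y):
    # Gaussian log-likelihood of one observation under every joint state.
    diff = y[None, :] - joint_mu
    return -0.5 * np.sum(diff ** 2, axis=1) / obs_var - 0.5 * D * np.log(2 * np.pi * obs_var)

# Simulate a sequence from the model, then run the exact forward pass in log space.
states = [tuple(rng.choice(K, p=pi[m]) for m in range(M))]
for _ in range(T - 1):
    states.append(tuple(rng.choice(K, p=A[m][states[-1][m]]) for m in range(M)))
Y = np.array([sum(W[m][s[m]] for m in range(M)) + rng.normal(scale=np.sqrt(obs_var), size=D)
              for s in states])

log_alpha = np.log(joint_pi) + loglik(Y[0])
for t in range(1, T):
    log_alpha = loglik(Y[t]) + np.logaddexp.reduce(
        log_alpha[:, None] + np.log(joint_A), axis=0)
print(f"{S} joint states; log p(Y) = {np.logaddexp.reduce(log_alpha):.2f}")

The collapsed transition matrix has (K**M)**2 entries, so this exact recursion costs on the order of T*K**(2M) operations; the structured variational approximation described in the abstract avoids this cost by decoupling the state variables across chains.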