
Global Convergence of the EM Algorithm for Unconstrained Latent Variable Models with Categorical Indicators


Author: Weissman, Alexander

Affiliation: Law Sch Admiss Council, Newtown, PA 18940, USA

Journal: PSYCHOMETRIKA

Year/Volume/Issue: 2013, Vol. 78, No. 1

Pages: 134-153


Subject classification: 0402 [Education - Psychology (degrees awardable in Education or Science)]; 04 [Education]; 0701 [Science - Mathematics]

Keywords: EM algorithm; latent variable models; latent class models; information theory; Kullback-Leibler divergence; relative entropy; variational calculus; convex optimization; optimal bounds

Abstract: Convergence of the expectation-maximization (EM) algorithm to a global optimum of the marginal log likelihood function for unconstrained latent variable models with categorical indicators is presented. The sufficient conditions under which global convergence of the EM algorithm is attainable are provided in an information-theoretic context by interpreting the EM algorithm as alternating minimization of the Kullback-Leibler divergence between two convex sets. It is shown that these conditions are satisfied by an unconstrained latent class model, yielding an optimal bound against which more highly constrained models may be compared.
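For orientation, the model class the abstract refers to can be illustrated with a standard EM fit of an unconstrained latent class model. The following is a minimal sketch, not the paper's implementation: it assumes binary indicators, a user-chosen number of classes `K`, and NumPy, and it alternates the usual E-step (posterior class responsibilities) and M-step (closed-form updates of class proportions and conditional response probabilities). The function name `em_latent_class` and all parameter names are illustrative.

```python
import numpy as np

def em_latent_class(X, K, n_iter=100, seed=0):
    """Fit an unconstrained latent class model with binary indicators by EM.

    X: (N, J) array of 0/1 responses; K: number of latent classes.
    Returns class proportions pi (K,) and conditional probabilities
    p (K, J), where p[k, j] = P(x_j = 1 | class k).
    """
    rng = np.random.default_rng(seed)
    N, J = X.shape
    pi = np.full(K, 1.0 / K)                    # start at uniform class weights
    p = rng.uniform(0.25, 0.75, size=(K, J))    # random interior start point

    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to
        # pi_k * prod_j p_kj^x_ij * (1 - p_kj)^(1 - x_ij), computed in log space.
        log_lik = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T   # (N, K)
        log_r = np.log(pi) + log_lik
        log_r -= log_r.max(axis=1, keepdims=True)               # stabilize exp
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: closed-form maximizers of the expected complete-data
        # log likelihood under the current responsibilities.
        nk = r.sum(axis=0)
        pi = nk / N
        p = (r.T @ X) / nk[:, None]
        p = np.clip(p, 1e-6, 1 - 1e-6)          # keep logs finite

    return pi, p
```

In the information-theoretic reading used by the paper, the E-step and M-step above are the two halves of an alternating minimization of the Kullback-Leibler divergence between a set of distributions over the latent classes and the set of model distributions; for the unconstrained latent class model neither update is restricted, which is the setting in which the abstract's global-convergence conditions are stated.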
