The one-step-late PXEM algorithm

Authors: Van Dyk, DA; Tang, RX

Affiliation: Harvard University, Department of Statistics, Cambridge, MA 02138, USA

Journal: Statistics and Computing

Year/Volume/Issue: 2003, Vol. 13, No. 2

Pages: 137-152

Subject classification: 0202 [Economics - Applied Economics]; 02 [Economics]; 020208 [Economics - Statistics]; 07 [Science]; 0714 [Science - Statistics (degree awardable in Science or Economics)]; 0812 [Engineering - Computer Science and Technology (degree awardable in Engineering or Science)]

Funding: National Science Foundation (NSF) grants DMS-01-04129 and DMS-97-05157

Keywords: dynamic linear model; EM algorithm; MAP estimates; one-step-late methods; PXEM algorithm; posterior modes; probit regression; rate of convergence; working parameters

Abstract: The EM algorithm is a popular method for computing maximum likelihood estimates or posterior modes in models that can be formulated in terms of missing data or latent structure. Although easy implementation and stable convergence help to explain the popularity of the algorithm, its convergence is sometimes notoriously slow. In recent years, however, various adaptations have significantly improved the speed of EM while maintaining its stability and simplicity. One especially successful method for maximum likelihood is known as the parameter expanded EM or PXEM algorithm. Unfortunately, PXEM does not generally have a closed form M-step when computing posterior modes, even when the corresponding EM algorithm is in closed form. In this paper we confront this problem by adapting the one-step-late EM algorithm to PXEM to establish a fast closed form algorithm that improves on the one-step-late EM algorithm by ensuring monotone convergence. We use this algorithm to fit a probit regression model and a variety of dynamic linear models, showing computational savings of as much as 99.9%, with the biggest savings occurring when the EM algorithm is the slowest to converge.
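
The following is an illustrative sketch only, not the paper's one-step-late PXEM algorithm for posterior modes. It shows ordinary EM and standard PX-EM iterations for maximum-likelihood probit regression (one of the abstract's examples), where the expanded model carries a working residual-scale parameter that is maximized and then reduced out. The simulated data, function names, and iteration count are assumptions for demonstration.

```python
# Illustrative sketch (not the paper's OSL-PXEM): ordinary EM and
# standard PX-EM steps for maximum-likelihood probit regression.
# The data and function names are assumptions for demonstration.
import numpy as np
from scipy.stats import norm


def em_step(beta, X, y):
    """One ordinary EM step: impute E[z_i | y_i, beta] for the latent
    z_i ~ N(x_i' beta, 1) truncated by the sign of y_i, then regress."""
    mu = X @ beta
    ez = np.where(y == 1,
                  mu + norm.pdf(mu) / norm.cdf(mu),     # truncated to z_i > 0
                  mu - norm.pdf(mu) / norm.cdf(-mu))    # truncated to z_i < 0
    return np.linalg.solve(X.T @ X, X.T @ ez)


def pxem_step(beta, X, y):
    """One PX-EM step: expand the model to z_i ~ N(x_i' gamma, alpha^2),
    maximize over (gamma, alpha), then reduce back via beta = gamma / alpha."""
    n = len(y)
    mu = X @ beta
    ez = np.where(y == 1,
                  mu + norm.pdf(mu) / norm.cdf(mu),
                  mu - norm.pdf(mu) / norm.cdf(-mu))
    ez2 = 1.0 + mu * ez                        # E[z_i^2 | y_i, beta]
    gamma = np.linalg.solve(X.T @ X, X.T @ ez)
    # Expected residual sum of squares in the expanded model
    rss = ez2.sum() - gamma @ (X.T @ ez)
    return gamma / np.sqrt(rss / n)            # reduction step


# Toy run on simulated data (assumed, not from the paper).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
y = (X @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=500) > 0).astype(int)

beta_em = np.zeros(X.shape[1])
beta_px = np.zeros(X.shape[1])
for _ in range(200):
    beta_em, beta_px = em_step(beta_em, X, y), pxem_step(beta_px, X, y)
print("EM:   ", beta_em)
print("PX-EM:", beta_px)
```

In this sketch the E-step imputes the first two moments of the truncated latent normals, and the reduction gamma / alpha is what speeds up plain EM; the paper's contribution, per the abstract, is obtaining an analogous closed-form, monotone update when the target is a posterior mode rather than the MLE.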
