
Mini-batch Metropolis-Hastings MCMC with Reversible SGLD Proposal

Authors: Wu, Tung-Yu; Wang, Y.X. Rachel; Wong, Wing H.

Affiliations: Institute for Computational and Mathematical Engineering, Stanford University; School of Mathematics and Statistics, University of Sydney; Department of Statistics, Stanford University

Published in: arXiv

Year: 2019

Subject: Stochastic systems

Abstract: Traditional MCMC algorithms are computationally intensive and do not scale well to large data. In particular, the Metropolis-Hastings (MH) algorithm requires passing over the entire dataset to evaluate the likelihood ratio in each iteration. We propose a general framework for performing MH-MCMC using mini-batches of the whole dataset and show that this gives rise to an approximately tempered stationary distribution. We prove that the algorithm preserves the modes of the original target distribution and derive an error bound on the approximation with mild assumptions on the likelihood. To further extend the utility of the algorithm to high dimensional settings, we construct a proposal with forward and reverse moves using stochastic gradient and show that the construction leads to reasonable acceptance probabilities. We demonstrate the performance of our algorithm in both low dimensional models and high dimensional neural network applications. Particularly in the latter case, compared to popular optimization methods, our method is more robust to the choice of learning rate and improves testing accuracy. Copyright © 2019, The Authors. All rights reserved.
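The abstract names two ingredients: a mini-batch estimate of the MH likelihood ratio, and an SGLD-style proposal with matched forward and reverse moves. The Python sketch below is a rough illustration of how such pieces typically fit together, not the authors' exact construction; the function minibatch_mh_sgld, the user-supplied callbacks (log_lik, grad_log_lik, log_prior, grad_log_prior), and the n/batch_size rescaling of mini-batch terms are all assumptions of this sketch.

```python
import numpy as np

def minibatch_mh_sgld(theta0, data, log_lik, grad_log_lik,
                      log_prior, grad_log_prior,
                      n_iters=1000, batch_size=100, step=1e-3, seed=0):
    """Hypothetical sketch of mini-batch MH with an SGLD-style proposal.

    log_lik(theta, batch)      -> summed log likelihood over the batch
    grad_log_lik(theta, batch) -> gradient of that sum w.r.t. theta
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    scale = n / batch_size  # rescale mini-batch sums toward full-data scale
    theta = np.asarray(theta0, dtype=float)
    samples = []

    def drift(t, batch):
        # Half-step Langevin drift from the rescaled stochastic gradient.
        g = scale * grad_log_lik(t, batch) + grad_log_prior(t)
        return t + 0.5 * step * g

    for _ in range(n_iters):
        batch = data[rng.choice(n, size=batch_size, replace=False)]
        mu_fwd = drift(theta, batch)
        prop = mu_fwd + np.sqrt(step) * rng.standard_normal(theta.shape)
        mu_rev = drift(prop, batch)  # reverse move on the same mini-batch

        # Mini-batch estimate of the log posterior ratio.
        log_post_ratio = (scale * (log_lik(prop, batch) - log_lik(theta, batch))
                          + log_prior(prop) - log_prior(theta))
        # Gaussian proposal correction: log q(theta|prop) - log q(prop|theta).
        log_q_ratio = (np.sum((prop - mu_fwd) ** 2)
                       - np.sum((theta - mu_rev) ** 2)) / (2.0 * step)

        if np.log(rng.random()) < log_post_ratio + log_q_ratio:
            theta = prop
        samples.append(theta.copy())
    return np.array(samples)
```

Reusing the same mini-batch for the forward drift, the reverse drift, and the acceptance ratio keeps the Gaussian proposal correction available in closed form; per the abstract, a chain of this kind targets only an approximately tempered version of the posterior rather than the exact target.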
