Author affiliations: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, United States; Department of Applied Mathematics and Computer Science, École des Ponts ParisTech; Ragon Institute of MGH, MIT and Harvard, Cambridge, United States; Chan-Zuckerberg Biohub, San Francisco, United States; Department of Statistics, University of California, Berkeley, United States; Department of Statistics, University of Michigan, Ann Arbor, United States
Publication: arXiv
Year: 2020
Subject: Importance sampling
Abstract: To make decisions based on a model fit by Auto-Encoding Variational Bayes (AEVB), practitioners typically use importance sampling to estimate a functional of the posterior distribution. The variational distribution found by AEVB serves as the proposal distribution for importance sampling. However, this proposal distribution may give unreliable (high-variance) importance sampling estimates, thus leading to poor decisions. We explore how changing the objective function for learning the variational distribution, while continuing to learn the generative model based on the ELBO, affects the quality of downstream decisions. For a particular model, we characterize the error of importance sampling as a function of posterior variance and show that proposal distributions learned with evidence upper bounds are better. Motivated by these theoretical results, we propose a novel variant of the VAE. In addition to experimenting with MNIST, we present a full-fledged application of the proposed method to single-cell RNA sequencing. In this challenging instance of multiple hypothesis testing, the proposed method surpasses the current state of the art. Copyright © 2020, The Authors. All rights reserved.
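The abstract's setup — estimating a posterior functional by importance sampling, with a learned variational distribution as the proposal — can be illustrated with a toy sketch. This is not the paper's method; the densities, the mismatched Gaussian proposal, and the effective-sample-size diagnostic below are illustrative assumptions only, showing how a poorly matched proposal degrades the estimate in the way the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized log "posterior": standard normal, so the true mean is 0.
def log_p(z):
    return -0.5 * z**2

# Stand-in for the variational proposal q(z | x): a Gaussian that is
# deliberately mismatched (wrong mean, inflated scale) to mimic a
# suboptimal learned proposal.
mu_q, sigma_q = 1.0, 2.0

def log_q(z):
    return -0.5 * ((z - mu_q) / sigma_q) ** 2 - np.log(sigma_q)

n = 100_000
z = rng.normal(mu_q, sigma_q, size=n)

# Self-normalized importance weights w_i ∝ p(z_i) / q(z_i),
# computed in log space for numerical stability.
log_w = log_p(z) - log_q(z)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Importance sampling estimate of the posterior mean E[z | x].
est = np.sum(w * z)

# Effective sample size: close to n for a well-matched proposal,
# much smaller when the weights have high variance.
ess = 1.0 / np.sum(w**2)
print(f"estimate = {est:.4f}, ESS = {ess:.0f} of {n}")
```

The effective sample size makes the abstract's concern concrete: a proposal with heavier mismatch yields high-variance weights, a small ESS, and an unreliable estimate of the decision-relevant functional.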