Mini-batch algorithms have become increasingly popular due to the requirement for solving optimization problems based on large-scale data sets. Using an existing online expectation-maximization (EM) algorithm framework, we demonstrate how mini-batch (MB) algorithms may be constructed, and we propose a scheme for the stochastic stabilization of the constructed mini-batch algorithms. Theoretical results regarding the convergence of the mini-batch EM algorithms are presented. We then demonstrate how the mini-batch framework may be applied to conduct maximum likelihood (ML) estimation of mixtures of exponential family distributions, with emphasis on ML estimation for mixtures of normal distributions. Via a simulation study, we demonstrate that the mini-batch algorithm for mixtures of normal distributions can outperform the standard EM algorithm. Further evidence of the performance of the mini-batch framework is provided via an application to the well-known MNIST data set.
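As a rough illustration of the kind of update this abstract describes, the following is a minimal sketch of a mini-batch EM pass for a mixture of isotropic normals, in the stochastic-approximation style of online EM: expected sufficient statistics are averaged over each mini-batch, and the running statistics are interpolated toward them before the M-step. The function name, the isotropic-covariance simplification, and the step-size schedule are assumptions for illustration, not the paper's construction or its stabilization scheme.

```python
import numpy as np

def minibatch_em_gmm(X, K, n_epochs=10, batch_size=256, gamma0=1.0, alpha=0.6, seed=0):
    """Hedged sketch: mini-batch EM for a mixture of K isotropic normals.
    Running sufficient statistics are moved toward the batch averages with
    step size gamma_t = gamma0 * (t + 1)**(-alpha).  Illustrative only."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Crude initialization: means at random data points, shared variance.
    mu = X[rng.choice(n, K, replace=False)]
    sigma2 = np.full(K, X.var())
    pi = np.full(K, 1.0 / K)
    # Running sufficient statistics:
    # s0_k = E[z_k], s1_k = E[z_k x], s2_k = E[z_k ||x||^2].
    s0 = pi.copy()
    s1 = mu * pi[:, None]
    s2 = pi * (sigma2 * d + (mu ** 2).sum(1))
    t = 0
    for _ in range(n_epochs):
        for start in range(0, n, batch_size):
            B = X[start:start + batch_size]
            # E-step on the mini-batch: responsibilities under current params.
            dist2 = ((B[:, None, :] - mu[None]) ** 2).sum(-1)        # (|B|, K)
            logw = np.log(pi) - 0.5 * d * np.log(sigma2) - 0.5 * dist2 / sigma2
            logw -= logw.max(1, keepdims=True)
            r = np.exp(logw)
            r /= r.sum(1, keepdims=True)
            # Batch-averaged sufficient statistics.
            b0 = r.mean(0)
            b1 = r.T @ B / len(B)
            b2 = (r * (B ** 2).sum(1, keepdims=True)).mean(0)
            # Stochastic-approximation update, then M-step from statistics.
            g = gamma0 * (t + 1) ** (-alpha)
            s0 += g * (b0 - s0)
            s1 += g * (b1 - s1)
            s2 += g * (b2 - s2)
            pi = s0 / s0.sum()
            mu = s1 / s0[:, None]
            sigma2 = np.maximum(s2 / (d * s0) - (mu ** 2).sum(1) / d, 1e-6)
            t += 1
    return pi, mu, sigma2
```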
ISBN (print): 9781538674116
Fuzzy co-clustering schemes, including Fuzzy Co-Clustering induced by Multinomial Mixture models (FCCMM), are promising approaches for analyzing object-item co-occurrence information such as document-keyword frequencies and customer-product purchase history transactions. However, such co-occurrence datasets are generally maintained as very large matrices and cannot be handled by conventional batch algorithms. Online algorithms, which sequentially load a single object for adjusting parameters, are effective approaches for big data analysis; mini-batch algorithms, which sequentially load a small chunk (mini-batch) of objects for adjusting parameters, are also effective. In this paper, we propose an online algorithm and a mini-batch algorithm for FCCMM clustering and observe their characteristics and performance through numerical experiments.
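For intuition about the "load a small chunk, then adjust parameters" pattern, here is a hedged sketch of a mini-batch EM-style loop for the plain multinomial mixture model that FCCMM is induced by. FCCMM itself introduces fuzzification on top of this model, which the sketch omits; all names and the step-size schedule are illustrative.

```python
import numpy as np

def minibatch_multinomial_mixture(X, C, n_epochs=10, batch_size=128,
                                  gamma0=1.0, alpha=0.6, seed=0):
    """Hedged sketch: mini-batch updates for a multinomial mixture over an
    object-item count matrix X (n objects x m items).  Only the chunk-wise
    statistic-interpolation pattern is shown; FCCMM's fuzzification is
    omitted, and the initialization scales are deliberately crude."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(C, 1.0 / C)                   # cluster weights
    theta = rng.dirichlet(np.ones(m), size=C)  # item distributions per cluster
    s_pi = pi.copy()
    s_theta = theta * pi[:, None]
    t = 0
    for _ in range(n_epochs):
        for start in range(0, n, batch_size):
            B = X[start:start + batch_size].astype(float)
            # E-step on the chunk: posterior memberships u_{ic}.
            logu = np.log(pi) + B @ np.log(theta).T    # (|B|, C)
            logu -= logu.max(1, keepdims=True)
            u = np.exp(logu)
            u /= u.sum(1, keepdims=True)
            # Chunk-averaged sufficient statistics.
            b_pi = u.mean(0)
            b_theta = u.T @ B / len(B)
            # Interpolate running statistics, then re-estimate parameters.
            g = gamma0 * (t + 1) ** (-alpha)
            s_pi += g * (b_pi - s_pi)
            s_theta += g * (b_theta - s_theta)
            pi = s_pi / s_pi.sum()
            theta = s_theta / s_theta.sum(1, keepdims=True)
            t += 1
    return pi, theta
```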
We prove convergence to minima, and estimates on the rate of convergence, for the stochastic gradient descent method in the case of objective functions that are not necessarily locally convex or contracting. In particular, the analysis relies on a quantitative use of mini-batches to control the loss of iterates to non-attracted regions. The applicability of the results to simple objective functions arising in machine learning is shown.
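A minimal sketch of the algorithm being analyzed may help: in plain mini-batch SGD, averaging per-sample gradients over a batch of size b reduces the variance of the gradient noise by a factor of b, which is the lever such analyses use to bound the probability that iterates escape to non-attracted regions. Everything below (names, the toy objective, the step-size schedule) is assumed for illustration.

```python
import numpy as np

def minibatch_sgd(grad_sample, theta0, data, n_steps=1000, batch_size=64,
                  eta0=0.1, alpha=0.75, seed=0):
    """Hedged sketch of plain mini-batch SGD with decaying step size
    eta_t = eta0 * (t + 1)**(-alpha).  grad_sample(theta, x) returns the
    gradient of the per-sample objective; the batch average divides the
    gradient-noise variance by batch_size."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for t in range(n_steps):
        batch = data[rng.choice(len(data), batch_size, replace=False)]
        # Averaged stochastic gradient over the mini-batch.
        g = np.mean([grad_sample(theta, x) for x in batch], axis=0)
        theta -= eta0 * (t + 1) ** (-alpha) * g
    return theta

# Usage on a toy per-sample objective (theta - x)^2 / 2, whose gradient is
# theta - x; the iterates should approach the sample mean of the data.
if __name__ == "__main__":
    data = np.random.default_rng(1).normal(3.0, 1.0, size=1000)
    theta_hat = minibatch_sgd(lambda th, x: th - x, 0.0, data)
    print(theta_hat)  # close to 3.0
```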