ISBN (print): 9781467369985
Primal-dual proximal optimization methods have recently gained much interest for dealing with very large-scale data sets encountered in many application fields such as machine learning, computer vision and inverse problems [1-3]. In this work, we propose a novel random block-coordinate version of such algorithms, allowing us to solve a wide array of convex variational problems. One of the main advantages of the proposed algorithm is its ability to solve composite problems involving large-size matrices without requiring any matrix inversion. In addition, almost sure convergence to an optimal solution of the problem is guaranteed. We illustrate the good performance of our method on a mesh denoising application.
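To make this concrete, here is a minimal NumPy sketch of a random block-coordinate primal-dual (Chambolle-Pock-type) iteration on the toy problem min_x 0.5||x - y||^2 + lam*||Dx||_1; the block splitting, step sizes and toy objective are illustrative assumptions, not the authors' exact algorithm. Note that the matrix D is only ever applied, never inverted.

```python
import numpy as np

def random_block_primal_dual(y, D, lam, tau, sigma, n_blocks=4,
                             n_iter=500, seed=0):
    """Illustrative random block-coordinate primal-dual iteration for
    min_x 0.5*||x - y||^2 + lam*||D x||_1.  Each iteration refreshes one
    randomly drawn block of primal and dual coordinates; D is only
    applied, never inverted.  Assumes tau * sigma * ||D||^2 <= 1."""
    rng = np.random.default_rng(seed)
    m, n = D.shape
    x, u = y.astype(float).copy(), np.zeros(m)
    pblocks = np.array_split(np.arange(n), n_blocks)
    dblocks = np.array_split(np.arange(m), n_blocks)
    for _ in range(n_iter):
        b = rng.integers(n_blocks)                 # random sweeping rule
        pb, db = pblocks[b], dblocks[b]
        # Primal block update: prox of 0.5*||. - y||^2 restricted to pb.
        x_old = x[pb].copy()
        x[pb] = (x[pb] - tau * (D[:, pb].T @ u) + tau * y[pb]) / (1.0 + tau)
        # Dual block update at the extrapolated primal point; the prox of
        # the conjugate of lam*||.||_1 is a projection onto [-lam, lam].
        x_bar = x.copy()
        x_bar[pb] = 2.0 * x[pb] - x_old
        u[db] = np.clip(u[db] + sigma * (D[db, :] @ x_bar), -lam, lam)
    return x
```

In a mesh denoising setting, D would typically be a (sparse) difference or incidence matrix of the mesh graph, which is exactly the kind of large-size operator one wants to apply without inversion.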
This work proposes block-coordinate fixed point algorithms with applications to nonlinear analysis and optimization in Hilbert spaces. The asymptotic analysis relies on a notion of stochastic quasi-Fejér monotonicity, which is thoroughly investigated. The iterative methods under consideration feature random sweeping rules to select arbitrarily the blocks of variables that are activated over the course of the iterations, and they allow for stochastic errors in the evaluation of the operators. Algorithms using quasi-nonexpansive operators or compositions of averaged nonexpansive operators are constructed, and weak and strong convergence results are established for the sequences they generate. As a by-product, novel block-coordinate operator splitting methods are obtained for solving structured monotone inclusion and convex minimization problems. In particular, the proposed framework leads to random block-coordinate versions of the Douglas-Rachford and forward-backward algorithms and of some of their variants. Even in the standard case of m = 1 block, our results remain new as they incorporate stochastic perturbations.
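A minimal sketch of the kind of iteration described, assuming a Krasnosel'skii-Mann update with Bernoulli block activation and additive evaluation noise; the operator T, relaxation parameter and activation probability below are illustrative choices, not the paper's general framework.

```python
import numpy as np

def block_coordinate_km(T, x0, n_blocks=4, n_iter=2000, relax=0.5,
                        p_activate=0.5, noise_std=0.0, seed=0):
    """Illustrative random block-coordinate Krasnosel'skii-Mann iteration:
    x_{n+1,i} = x_{n,i} + relax * (T(x_n)_i + e_{n,i} - x_{n,i}) on the
    blocks i activated by the random sweeping rule; e models stochastic
    errors in the evaluation of the operator T."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(n_iter):
        Tx = T(x)   # in practice only the activated blocks need evaluating
        for blk in blocks:
            if rng.random() < p_activate:          # random sweeping rule
                e = noise_std * rng.standard_normal(blk.size)
                x[blk] += relax * (Tx[blk] + e - x[blk])
    return x

# Toy usage: T is the (averaged) gradient-step operator of 0.5*||Ax - b||^2,
# whose fixed points are the least-squares solutions.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2
x_fix = block_coordinate_km(lambda x: x - gamma * A.T @ (A @ x - b),
                            np.zeros(10))
```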
Combettes and Pesquet (SIAM J Optim 25:1221-1248, 2015) investigated the almost sure weak convergence of block-coordinate fixed point algorithms and discussed their applications to nonlinear analysis and optimization. This algorithmic framework features random sweeping rules to select arbitrarily the blocks of variables that are activated over the course of the iterations and it allows for stochastic errors in the evaluation of the operators. The present paper establishes results on the mean-square and linear convergence of the iterates. Applications to monotone operator splitting and proximal optimization algorithms are presented.
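As an illustration of what mean-square linear convergence means in practice, the following assumed toy experiment averages the squared error of a random block-coordinate gradient iteration over independent runs and fits a geometric rate; the quadratic model and block rule are stand-ins for the demo, not the paper's setting.

```python
import numpy as np

# Empirical check of mean-square linear convergence: average ||x_k - x*||^2
# over independent runs of a random block-coordinate gradient iteration on
# a strongly convex quadratic, then fit a geometric rate.
rng = np.random.default_rng(0)
n, n_iter, n_runs = 10, 300, 200
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)                      # strongly convex Hessian
b = rng.standard_normal(n)
x_star = np.linalg.solve(Q, b)
gamma = 1.0 / np.linalg.eigvalsh(Q).max()
mse = np.zeros(n_iter)
for _ in range(n_runs):
    x = np.zeros(n)
    for k in range(n_iter):
        blk = rng.choice(n, size=n // 2, replace=False)   # random block
        x[blk] -= gamma * (Q @ x - b)[blk]                # block gradient step
        mse[k] += np.sum((x - x_star) ** 2) / n_runs
rho = np.exp(np.polyfit(np.arange(n_iter), np.log(mse), 1)[0])
print(f"fitted mean-square rate: E||x_k - x*||^2 ~ rho^k with rho = {rho:.3f}")
```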
ISBN (print): 9789082797015
Recent random block-coordinate fixed point algorithms are particularly well suited to large-scale optimization in signal and image processing. These algorithms feature random sweeping rules to select arbitrarily the blocks of variables that are activated over the course of the iterations and they allow for stochastic errors in the evaluation of the operators. The present paper provides new linear convergence results. These convergence rates are compared to those of standard deterministic algorithms both theoretically and experimentally in an image recovery problem.
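The kind of theory-versus-experiment comparison the paper describes can be mimicked on a toy recovery problem: the sketch below contrasts a deterministic full update with a random sweeping rule that activates about a quarter of the coordinates per iteration. The quadratic model stands in for an image recovery objective and is an assumption of the demo.

```python
import numpy as np

# Per-iteration comparison on a small Tikhonov-regularized recovery
# problem: deterministic full gradient step vs. random block-coordinate
# step (random sweeping).  Sizes and weights are arbitrary stand-ins.
rng = np.random.default_rng(2)
n = 64
A = rng.standard_normal((n, n)) / np.sqrt(n)
Q = A.T @ A + 0.1 * np.eye(n)            # normal equations + Tikhonov term
b = A.T @ rng.standard_normal(n)
x_star = np.linalg.solve(Q, b)
step = 1.0 / np.linalg.eigvalsh(Q).max()

def run(p_activate, n_iter=400):
    x, err = np.zeros(n), np.zeros(n_iter)
    for k in range(n_iter):
        mask = rng.random(n) < p_activate    # coordinates refreshed this pass
        x[mask] -= step * (Q @ x - b)[mask]
        err[k] = np.linalg.norm(x - x_star)
    return err

e_det, e_rnd = run(1.0), run(0.25)
# A fair comparison normalizes by work: the random sweep touches ~25% of
# the coordinates per iteration, so compare rates per coordinate update.
print("deterministic rate/iter :", (e_det[-1] / e_det[0]) ** (1 / e_det.size))
print("random-block rate/iter  :", (e_rnd[-1] / e_rnd[0]) ** (1 / e_rnd.size))
```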
Sampling-based algorithms are classical approaches to perform Bayesian inference in inverse problems. They provide estimators with the associated credibility intervals to quantify the uncertainty on the estimators. Although these methods scale poorly to high-dimensional problems, they have recently been paired with optimization techniques, such as proximal and splitting approaches, to address this issue. Such approaches pave the way to distributed samplers, splitting computations to make inference more scalable and faster. We introduce a distributed Split Gibbs sampler (SGS) to efficiently solve such problems involving distributions with multiple smooth and non-smooth functions composed with linear operators. The proposed approach leverages a recent approximate augmentation technique reminiscent of primal-dual optimization methods. It is further combined with a block-coordinate approach to split the primal and dual variables into blocks, leading to a distributed block-coordinate SGS. The resulting algorithm exploits the hypergraph structure of the involved linear operators to efficiently distribute the variables over multiple workers under controlled communication costs. It accommodates several distributed architectures, such as the Single Program Multiple Data and client-server architectures. Experiments on a large image deblurring problem show the ability of the proposed approach to produce high-quality estimates with credibility intervals in a short amount of time. Supplementary material to reproduce the experiments is available online.
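To show the split-Gibbs mechanics in their simplest form, here is a toy, non-distributed sketch in which both potentials are Gaussian so that the two conditionals can be sampled exactly; the model, coupling parameter and burn-in are illustrative assumptions, and the paper's handling of non-smooth terms and its distribution of blocks over workers are not reproduced.

```python
import numpy as np

# Split-Gibbs mechanics on a toy Gaussian model
#   p(x | y) ∝ exp(-||y - A x||^2/(2*s2) - ||x||^2/(2*t2)).
# An auxiliary variable z and a coupling variance rho2 split the two
# potentials so the sampler alternates two cheap Gaussian conditionals;
# as rho2 -> 0 the marginal of x approaches the target.
rng = np.random.default_rng(0)
m, n = 30, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
s2, t2, rho2 = 0.1, 1.0, 0.05          # noise, prior, coupling variances
y = A @ rng.standard_normal(n) + np.sqrt(s2) * rng.standard_normal(m)

# Conditional covariances are fixed, so factor them once.
Cx = np.linalg.inv(A.T @ A / s2 + np.eye(n) / rho2)   # cov of x | z, y
Lx = np.linalg.cholesky(Cx)
cz = 1.0 / (1.0 / t2 + 1.0 / rho2)                    # scalar cov of z | x

z, samples = np.zeros(n), []
for it in range(5000):
    # x | z, y : couples the data term with the splitting penalty.
    x = Cx @ (A.T @ y / s2 + z / rho2) + Lx @ rng.standard_normal(n)
    # z | x : couples the prior with the splitting penalty (coordinatewise).
    z = cz * x / rho2 + np.sqrt(cz) * rng.standard_normal(n)
    if it >= 1000:                                    # discard burn-in
        samples.append(x)
samples = np.asarray(samples)
post_mean = samples.mean(axis=0)
ci_lo, ci_hi = np.percentile(samples, [2.5, 97.5], axis=0)  # credibility bands
```

In the paper's distributed variant, the primal and dual blocks of such updates would be assigned to different workers according to the hypergraph structure of the linear operators; this sketch only illustrates the alternating-conditional core.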