The Douglas-Rachford algorithm (DRA) is a powerful optimization method for minimizing the sum of two convex (not necessarily smooth) functions. The vast majority of previous research has dealt with the case where the sum has at least one minimizer. In the absence of minimizers, it was recently shown that, for the case of two indicator functions, the DRA converges to a best approximation solution. In this paper, we present a new convergence result for the DRA applied to the problem of minimizing a convex function subject to a linear constraint. Indeed, a normal solution may be found even when the domain of the objective function and the linear subspace constraint have no point in common. As an important application, a new parallel splitting result is provided. We also illustrate our results through various examples.
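For reference, the following is a minimal numerical sketch of the standard Douglas-Rachford iteration applied to the setting described above: a convex objective plus a linear subspace constraint. The quadratic objective f(x) = 0.5*||x - c||^2, the subspace U = ker(A), and all names (A, c, gamma, prox_f, proj_U) are illustrative assumptions, not taken from the paper.

import numpy as np

# Illustrative sketch only: standard Douglas-Rachford iteration for
#   minimize f(x) = 0.5*||x - c||^2   subject to   x in U = ker(A),
# where U is a linear subspace. A, c, gamma and the iteration count are
# made-up illustration data, not taken from the paper.
rng = np.random.default_rng(0)
m, d = 2, 5
A = rng.standard_normal((m, d))   # constraint subspace U = {x : A x = 0}
c = rng.standard_normal(d)
gamma = 1.0                       # proximal step size

def prox_f(v):
    # prox of gamma*f for f(x) = 0.5*||x - c||^2 (closed form)
    return (v + gamma * c) / (1.0 + gamma)

# orthogonal projector onto ker(A); prox of the indicator of U
P = np.eye(d) - A.T @ np.linalg.solve(A @ A.T, A)

def proj_U(v):
    return P @ v

x = np.zeros(d)                   # governing DRA sequence
for _ in range(2000):
    y = prox_f(x)
    x = x + proj_U(2.0 * y - x) - y

x_star = proj_U(c)                # known minimizer: projection of c onto U
print(np.linalg.norm(prox_f(x) - x_star))   # shadow-sequence error, close to 0

In this toy problem minimizers exist (U is a nonempty subspace), so the shadow sequence prox_f(x_n) converges to the constrained minimizer; the paper's contribution concerns the inconsistent case, where the domain of the objective and the subspace need not intersect, which this sketch does not exercise.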