The dual ascent method (DAM) is an effective method for solving linearly constrained convex optimization problems. The classical DAM converges extremely slowly because of its small stepsize; He et al. improved it by relaxing the stepsize condition and introducing a self-adaptive stepsize rule, which increases its convergence speed. In this paper, we further relax the stepsize condition while still guaranteeing convergence, provided that the objective function is quadratic. We demonstrate the encouraging performance of the new DAM with the relaxed stepsize condition through experiments on both synthetic and real problems.
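For context, a minimal sketch of the classical dual ascent iteration on a quadratic program is shown below. This is a generic textbook illustration with a fixed stepsize `alpha`, not the authors' relaxed or self-adaptive variant; the function name and problem instance are hypothetical.

```python
import numpy as np

def dual_ascent_qp(Q, c, A, b, alpha=0.1, iters=500):
    """Classical dual ascent for: minimize (1/2) x^T Q x + c^T x  s.t.  A x = b,
    with Q symmetric positive definite. Illustrative sketch only."""
    lam = np.zeros(A.shape[0])  # dual variable (Lagrange multipliers)
    for _ in range(iters):
        # x-update: minimize the Lagrangian in x; closed form for quadratics,
        # solving Q x = -(c + A^T lam)
        x = np.linalg.solve(Q, -(c + A.T @ lam))
        # dual ascent step on the equality-constraint residual
        lam = lam + alpha * (A @ x - b)
    return x, lam

# Hypothetical small instance: minimize x1^2 + x2^2 - 2 x1 - 4 x2  s.t.  x1 + x2 = 1
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = dual_ascent_qp(Q, c, A, b)
```

For this instance the iterates converge to the optimizer x = (0, 1) with multiplier λ = 2; convergence of the plain scheme requires the stepsize to be small relative to the problem data, which is exactly the restriction the stepsize-condition relaxations target.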