For smooth convex optimization problems, the theoretically optimal convergence rate of first-order algorithms is O(1/k^2). This paper proposes three improved accelerated gradient algorithms that exploit the gradient information at the latest iterate. For the step size, new adaptive line search strategies are adopted to avoid using the global Lipschitz constant and to make the algorithms converge faster. By constructing a descent Lyapunov function, we prove that the proposed algorithms preserve the O(1/k^2) convergence rate. Numerical experiments demonstrate that our algorithms outperform some existing algorithms that attain the optimal convergence rate.
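The abstract does not spell out the three algorithms, so the following is only a minimal sketch of the general family it describes: a Nesterov-type accelerated gradient method where the step size 1/L is found by a backtracking line search rather than fixed from a global Lipschitz constant. The function handles `f` and `grad`, the growth factor `eta`, and the heuristic of shrinking `L` between iterations are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def accelerated_gradient(grad, f, x0, L0=1.0, eta=2.0, max_iter=500, tol=1e-8):
    """Sketch of a Nesterov-style accelerated gradient method with an
    adaptive (backtracking) line search; assumptions as noted above."""
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(max_iter):
        g = grad(y)
        # Backtracking: grow L until the quadratic upper bound holds at y,
        # so 1/L acts as a local step size without a global Lipschitz constant.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Standard momentum update (as in Nesterov/FISTA-type schemes).
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x, t = x_new, t_new
        L /= eta  # let the local estimate shrink again (adaptive, non-monotone)
    return x

if __name__ == "__main__":
    # Toy usage: minimize a simple convex quadratic 0.5 * x^T A x.
    A = np.diag([1.0, 10.0])
    sol = accelerated_gradient(lambda x: A @ x, lambda x: 0.5 * x @ A @ x,
                               np.array([5.0, 5.0]))
    print(sol)  # converges toward the minimizer at the origin
```

Under smoothness and convexity, methods of this type achieve the O(1/k^2) rate in function value, which is the rate the paper proves is preserved by its adaptive variants.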