Almost Sure Convergence of Proximal Stochastic Accelerated Gradient Methods

Authors: Xin Xiang; Haoming Xia

Affiliation: Key Laboratory of Optimization Theory and Applications, School of Mathematics and Information, China West Normal University, Nanchong, China

Journal: Journal of Applied Mathematics and Physics

Year/Volume/Issue: 2024, Vol. 12, No. 4

Pages: 1321-1336

Subject Classification: 07 [Science]; 0701 [Science - Mathematics]; 070101 [Science - Pure Mathematics]

Keywords: Proximal Stochastic Accelerated Method; Almost Sure Convergence; Composite Optimization; Non-Smooth Optimization; Stochastic Optimization; Accelerated Gradient Method

Abstract: Proximal gradient descent and its accelerated version are effective methods for minimizing the sum of a smooth and a non-smooth function. When the smooth function can be represented as a sum of multiple component functions, the stochastic proximal gradient method performs well; however, the behavior of its accelerated version is not yet well understood. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems combining smooth and non-smooth components, where the smooth part is the average of multiple block sums. At the same time, most existing convergence analyses hold only in expectation. To address this, under some mild conditions, we establish almost sure convergence of unbiased gradient estimation in the non-smooth setting. Moreover, we establish that the minimum of the squared gradient mapping norm converges to zero with probability one.
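The abstract describes the problem class but not the paper's exact update rules or parameter schedules, which are given in the full text. As a rough, hypothetical illustration only, the Python sketch below combines the three ingredients the abstract names: a Nesterov-style extrapolation step, a uniformly sampled block gradient (an unbiased estimate of the full gradient), and a proximal step, with soft-thresholding standing in for an assumed l1 non-smooth term. The function names, the fixed momentum constant, and the step size are all assumptions for illustration, not the paper's PSAG method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), used here as an
    # example prox map for the non-smooth term.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def psag_sketch(grad_blocks, lam, x0, step, momentum, n_iters, seed=0):
    # Hypothetical accelerated proximal stochastic gradient loop for
    #     min_x (1/n) * sum_i f_i(x) + lam * ||x||_1,
    # where grad_blocks[i](x) returns grad f_i(x). Sampling one block
    # uniformly gives an unbiased estimate of the full gradient.
    rng = np.random.default_rng(seed)
    n = len(grad_blocks)
    x = np.asarray(x0, dtype=float).copy()
    x_prev = x.copy()
    for _ in range(n_iters):
        y = x + momentum * (x - x_prev)        # Nesterov-style extrapolation
        g = grad_blocks[rng.integers(n)](y)    # unbiased stochastic gradient at y
        x_prev = x
        x = soft_threshold(y - step * g, step * lam)  # proximal (forward-backward) step
    return x

# Toy usage: average of least-squares blocks plus an l1 penalty.
rng = np.random.default_rng(0)
blocks = [(rng.standard_normal((5, 3)), rng.standard_normal(5)) for _ in range(4)]
grads = [lambda x, A=A, b=b: A.T @ (A @ x - b) for A, b in blocks]
x_hat = psag_sketch(grads, lam=0.1, x0=np.zeros(3), step=0.02, momentum=0.5, n_iters=3000)
print(x_hat)
```

A fixed momentum constant is used only to keep the sketch short; accelerated methods typically use an iteration-dependent momentum schedule, and the schedule actually analyzed in the paper should be taken from the full text.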
