FastAdaBelief: Improving Convergence Rate for Belief-based Adaptive Optimizers by Exploiting Strong Convexity

Authors: Zhou, Yangfan; Huang, Kaizhu; Cheng, Cheng; Wang, Xuguang; Hussain, Amir; Liu, Xin

Affiliations: School of Nano-Tech and Nano-Bionics, University of Science and Technology of China, 96 Jinzhai Road, Hefei 230026, Anhui Province, China; Chinese Academy of Sciences, 398 Ruoshui Road, Suzhou Industrial Park, Suzhou 215123, Jiangsu Province, China; Data Science Research Center, Duke Kunshan University, No. 8 Duke Avenue, Kunshan 215316, China; School of Computing, Edinburgh Napier University, Edinburgh EH11 4BN, United Kingdom; Gusu Laboratory of Materials, 388 Ruoshui Road, Suzhou 215123, Jiangsu Province, China

Publication: arXiv

Year/Volume/Issue: 2021

Subject: Image classification

Abstract: AdaBelief, one of the current best optimizers, demonstrates superior generalization ability to the popular Adam algorithm by viewing the exponential moving average of observed gradients as a prediction of the next gradient. AdaBelief is theoretically appealing in that it has a data-dependent O(√T) regret bound when the objective functions are convex, where T is the time horizon. It remains an open problem, however, whether the convergence rate can be further improved without sacrificing generalization ability. To this end, we make a first attempt in this work and design a novel optimization algorithm, FastAdaBelief, that exploits strong convexity of the objective to achieve an even faster convergence rate. In particular, by adjusting the step size so that it better accounts for strong convexity and suppresses fluctuation, the proposed FastAdaBelief demonstrates excellent generalization ability as well as superior convergence. As an important theoretical contribution, we prove that FastAdaBelief attains a data-dependent O(log T) regret bound, which is substantially lower than AdaBelief's in strongly convex cases. On the empirical side, we validate our theoretical analysis with extensive experiments in both strongly convex and non-convex scenarios using three popular baseline models. The experimental results are very encouraging: FastAdaBelief converges fastest among all mainstream algorithms while maintaining excellent generalization ability, in both strongly convex and non-convex cases. FastAdaBelief is thus posited as a new benchmark model for the research community. © 2021, CC BY.
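
For reference, the regret bounds quoted in the abstract are with respect to the standard online-convex-optimization notion of regret; the definition below is the conventional one and is supplied here as an assumption, not taken from the record:

```latex
% Cumulative regret of the iterates \theta_t over horizon T, measured
% against the best fixed parameter in hindsight on the losses f_t:
\[
  R(T) \;=\; \sum_{t=1}^{T} f_t(\theta_t) \;-\; \min_{\theta} \sum_{t=1}^{T} f_t(\theta),
  \qquad
  \underbrace{R(T) = O(\sqrt{T})}_{\text{AdaBelief, convex } f_t}
  \quad\text{vs.}\quad
  \underbrace{R(T) = O(\log T)}_{\text{FastAdaBelief, strongly convex } f_t}.
\]
```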
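
The record contains no pseudocode, so the following is only a minimal NumPy sketch of an AdaBelief-style step combined with a 1/t step-size decay, the standard device for converting strong convexity into an O(log T) regret bound; the function name, the decay schedule, and all hyperparameters are assumptions for illustration, not the paper's definitive update rule.

```python
import numpy as np

def fastadabelief_like_step(theta, grad, state, t,
                            alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One optimizer step in the spirit of FastAdaBelief (sketch only).

    AdaBelief tracks m_t, an EMA of gradients treated as a prediction of
    the next gradient, and s_t, an EMA of the squared deviation
    (g_t - m_t)^2 (the "belief" term). To exploit strong convexity, the
    effective step size here decays like 1/t, the usual route to an
    O(log T) regret bound; the paper's exact schedule may differ.
    """
    m, s = state
    m = beta1 * m + (1 - beta1) * grad              # EMA of gradients
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2   # EMA of belief term
    m_hat = m / (1 - beta1 ** t)                    # bias correction
    s_hat = s / (1 - beta2 ** t)
    step = alpha / t                                # 1/t decay (strong convexity)
    theta = theta - step * m_hat / (np.sqrt(s_hat) + eps)
    return theta, (m, s)

# Usage sketch: minimize the strongly convex f(x) = ||x||^2 / 2.
theta = np.array([5.0, -3.0])
state = (np.zeros_like(theta), np.zeros_like(theta))
for t in range(1, 2001):
    grad = theta                                    # gradient of ||x||^2 / 2
    theta, state = fastadabelief_like_step(theta, grad, state, t, alpha=0.5)
```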
