
Explicit and Implicit Graduated Optimization in Deep Neural Networks

Authors: Sato, Naoki; Iiduka, Hideaki

Affiliation: Meiji University, Japan

Published in: arXiv

Year: 2024

Subject: Optimization algorithms

Abstract: Graduated optimization is a global optimization technique that minimizes a multimodal nonconvex function by smoothing the objective function with noise and gradually refining the solution. This paper experimentally evaluates the performance of the explicit graduated optimization algorithm with an optimal noise scheduling derived from a previous study and discusses its limitations. The evaluation uses traditional benchmark functions and empirical loss functions of modern neural network architectures. In addition, this paper extends the implicit graduated optimization algorithm, which is based on the fact that stochastic noise in the optimization process of SGD implicitly smooths the objective function, to SGD with momentum, analyzes its convergence, and demonstrates its effectiveness through experiments on image classification tasks with ResNet architectures. © 2024, CC BY.
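To illustrate the explicit variant the abstract describes, below is a minimal Python sketch of graduated optimization on a toy multimodal function: the objective is smoothed by Gaussian perturbations, minimized at a coarse noise level, and the solution is warm-started at successively finer levels. The noise schedule, step size, and gradient estimator here are illustrative assumptions, not the optimal schedule analyzed in the paper.

```python
import numpy as np

def smoothed_grad(f, x, sigma, n_samples=64, rng=None):
    """Monte Carlo estimate of the gradient of the Gaussian-smoothed
    surrogate f_sigma(x) = E_u[f(x + sigma * u)], u ~ N(0, I)."""
    rng = rng or np.random.default_rng()
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / n_samples

def graduated_optimize(f, x0, sigmas=(2.0, 1.0, 0.5, 0.1),
                       lr=0.05, steps_per_level=200, seed=0):
    """Explicit graduated optimization: run gradient descent on
    successively less-smoothed versions of f, warm-starting each
    noise level from the previous solution."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for sigma in sigmas:  # coarse-to-fine noise schedule (assumed)
        for _ in range(steps_per_level):
            x -= lr * smoothed_grad(f, x, sigma, rng=rng)
    return x

# Toy 1-D multimodal benchmark (Rastrigin-like, global minimum at 0):
f = lambda x: float(np.sum(x**2 + 2.0 * (1.0 - np.cos(4.0 * np.pi * x))))
print(graduated_optimize(f, x0=np.array([2.5])))  # lands near 0
```

Plain gradient descent started at the same point would typically stall in one of the local minima near the start; the coarse smoothing levels flatten those minima so the iterate can first find the basin of the global minimum, matching the coarse-to-fine intuition behind the method.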
