
Accelerated First-Order Optimization under Nonlinear Constraints

Authors: Muehlebach, Michael; Jordan, Michael I.

Affiliations: Learning and Dynamical Systems, Max Planck Institute for Intelligent Systems, Max-Planck-Ring 4, 72076 Tübingen, Baden-Württemberg, Germany; Department of Electrical Engineering and Computer Science, University of California, Berkeley, 387 Soda Hall, Berkeley, CA 94720, United States

Publication: arXiv

Year: 2023

Subject: Constrained optimization

Abstract: We exploit analogies between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms for constrained optimization. Unlike Frank-Wolfe or projected gradients, these algorithms avoid optimization over the entire feasible set at each iteration. We prove convergence to stationary points even in a nonconvex setting, and we derive accelerated rates for the convex setting both in continuous time and in discrete time. An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions, which naturally leads to sparse, local, and convex approximations of the feasible set (even if the feasible set is nonconvex). Thus, the complexity tends to grow mildly in the number of decision variables and in the number of constraints, which makes the algorithms suitable for machine learning applications. We apply our algorithms to a compressed sensing and a sparse regression problem, showing that we can treat nonconvex ℓp constraints. © 2023, CC BY.
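The abstract's central idea — constraining velocities rather than positions, so each iteration only solves a small local projection instead of projecting onto the full feasible set — can be illustrated with a minimal sketch. This is not the authors' implementation; the objective, constraint, and parameters (f, g, eta, alpha) are illustrative choices for a single smooth constraint.

```python
import numpy as np

# Sketch of a velocity-constrained gradient method: instead of projecting
# iterates onto the feasible set {x : g(x) >= 0}, each step projects the
# descent direction onto the local half-space of admissible velocities
# {v : grad_g(x)^T v >= -alpha * g(x)}, a convex set even when the
# feasible set is not. All names here are illustrative, not from the paper.

def f_grad(x, target):
    """Gradient of the objective f(x) = ||x - target||^2."""
    return 2.0 * (x - target)

def g(x):
    """Constraint g(x) = 1 - ||x||^2 >= 0 (the unit ball)."""
    return 1.0 - x @ x

def g_grad(x):
    """Gradient of the constraint function."""
    return -2.0 * x

def velocity_step(x, target, alpha):
    """Project the negative gradient onto the velocity half-space."""
    d = -f_grad(x, target)       # unconstrained descent direction
    a = g_grad(x)
    slack = -alpha * g(x)        # admissible velocities: a^T v >= slack
    if a @ d < slack and a @ a > 0.0:
        # closed-form projection onto the half-space {v : a^T v >= slack}
        d = d + (slack - a @ d) / (a @ a) * a
    return d

target = np.array([2.0, 0.0])    # unconstrained optimum lies outside the ball
x = np.zeros(2)
eta, alpha = 0.05, 1.0           # step size and constraint gain (illustrative)
for _ in range(1000):
    x = x + eta * velocity_step(x, target, alpha)
# x approaches [1, 0], the projection of target onto the unit ball
```

Because the per-iteration work is only a projection onto a half-space (closed-form here, a small QP with several constraints), the cost per step stays local and convex regardless of the global shape of the feasible set, which is the complexity advantage the abstract describes.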
