
Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval


Authors: Chen, Yuxin; Chi, Yuejie; Fan, Jianqing; Ma, Cong

Affiliations: Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA; Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA 15213 USA; Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA

Published in: MATHEMATICAL PROGRAMMING

Year/Volume/Issue: 2019, Vol. 176, Issue 1-2

Pages: 5-37


Subject classification: 1201 [Management: Management Science and Engineering]; 07 [Science]; 070104 [Science: Applied Mathematics]; 0835 [Engineering: Software Engineering]; 0701 [Science: Mathematics]

Funding: AFOSR YIP award [FA9550-19-1-0030]; ARO [W911NF-18-1-0303]; ONR [N00014-19-1-2120, N00014-18-1-2142]; Princeton SEAS innovation award; AFOSR [FA9550-15-1-0205]; NSF [CAREER ECCS-1818571, CCF-1806154, DMS-1662139, DMS-1712591]; NIH [2R01-GM072611-13]

Keywords: initialization; optimization algorithms; gradient descent; physical science; decoupling; Gauss; retrieval; global convergence

Abstract: This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest x^♮ ∈ R^n from m quadratic equations/samples y_i = (a_i^⊤ x^♮)^2, 1 ≤ i ≤ m. This problem, also dubbed phase retrieval, spans multiple domains including physical sciences and machine learning. We investigate the efficacy of gradient descent (or Wirtinger flow) designed for the nonconvex least squares problem. We prove that under Gaussian designs, gradient descent, when randomly initialized, yields an ε-accurate solution in O(log n + log(1/ε)) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need for (i) carefully designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependency between the gradient descent iterates and the data.
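The setting described in the abstract can be sketched numerically: minimize the nonconvex least-squares loss f(x) = (1/4m) Σ_i ((a_i^⊤ x)^2 - y_i)^2 by plain gradient descent from a random starting point, with no spectral initialization. This is a minimal illustrative sketch under Gaussian designs; the dimensions, step size, and iteration count are hypothetical choices for the demo, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 20, 400                      # signal dimension and sample size (illustrative)
x_star = rng.standard_normal(n)     # ground-truth signal x^natural
A = rng.standard_normal((m, n))     # Gaussian design vectors a_i as rows
y = (A @ x_star) ** 2               # quadratic samples y_i = (a_i^T x^natural)^2

def grad(x):
    """Gradient of f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2."""
    r = A @ x
    return (A.T @ ((r ** 2 - y) * r)) / m

x = rng.standard_normal(n)          # random initialization (no careful init)
eta = 0.1 / np.mean(y)              # step size scaled by signal energy (heuristic)
for _ in range(500):
    x -= eta * grad(x)

# Phase retrieval recovers x_star only up to a global sign flip.
rel_err = min(np.linalg.norm(x - x_star),
              np.linalg.norm(x + x_star)) / np.linalg.norm(x_star)
```

With m/n large enough, the iterates escape the region around saddle points and then converge linearly, consistent with the O(log n + log(1/ε)) iteration bound stated above; `rel_err` ends up near machine precision on this toy instance.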
