In this paper, we characterize Lipschitzian properties of different multiplier-free and multiplier-dependent perturbation mappings associated with the stationarity system of a so-called generalized nonlinear program popularized by Rockafellar. Special emphasis is put on the investigation of the isolated calmness property at and around a point. The latter is decisive for the locally fast convergence of the so-called semismooth* Newton-type method by Gfrerer and Outrata. Our central result is the characterization of isolated calmness at a point of a multiplier-free perturbation mapping via a combination of an explicit condition and a rather mild assumption, which is automatically satisfied, e.g., for standard nonlinear programs. Isolated calmness around a point is characterized analogously by a combination of two stronger conditions. These findings are then related to the so-called criticality of Lagrange multipliers, as introduced by Izmailov and extended to generalized nonlinear programming by Mordukhovich and Sarabi. We derive a new sufficient condition (a characterization for some problem classes) for the nonexistence of critical multipliers, which has also been used in the literature as an assumption to guarantee fast local convergence of Newton-, SQP-, or multiplier-penalty-type methods. The obtained insights about critical multipliers seem to complement the vast literature on the topic.
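For orientation, the display below sketches the kind of problem and perturbation mappings the abstract refers to; the symbols f, g, F, M, and S are illustrative assumptions of ours and are not taken from the paper. A generalized nonlinear program in Rockafellar's sense can be written as a composite problem with smooth data f, F and a proper, lower semicontinuous convex function g, whose stationarity system involves a multiplier λ:
\[
\min_{x \in \mathbb{R}^n} \; f(x) + g(F(x)),
\qquad
0 \in \nabla f(x) + \nabla F(x)^{\top}\lambda, \quad \lambda \in \partial g(F(x)).
\]
A multiplier-dependent (KKT-type) perturbation mapping and a multiplier-free one may then be sketched as
\[
M(p,q) := \bigl\{(x,\lambda) : p \in \nabla f(x) + \nabla F(x)^{\top}\lambda,\; \lambda \in \partial g(F(x)+q)\bigr\},
\qquad
S(p) := \bigl\{x : p \in \nabla f(x) + \nabla F(x)^{\top}\partial g(F(x))\bigr\},
\]
and the isolated calmness properties discussed in the abstract concern the stability of these solution sets under small perturbations (p, q) of the stationarity system.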
Second-order sufficient conditions for local optimality have long been central to designing solution algorithms and justifying claims about their convergence. Here a far-reaching extension of such conditions, called variational sufficiency, is explored in territory beyond just classical nonlinear programming. Variational sufficiency is already known to support multiplier methods that are able, even without convexity, to achieve problem decomposition, but further insight has been needed into how it coordinates with other sufficient conditions. In the framework of this paper, it is shown to characterize local optimality in terms of having a convex-concave-type local saddle point of an augmented Lagrangian function. A stronger version of variational sufficiency is tied in turn to local strong convexity in the primal argument of that function and to a property of augmented tilt stability that offers crucial aid to Lagrange multiplier methods at a fundamental level of analysis. Moreover, that strong version is translated here through second-order variational analysis into statements that can readily be compared to existing sufficient conditions in nonlinear programming, second-order cone programming, and other problem formulations that can incorporate nonsmooth objectives and regularization terms.
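To make the saddle point statement concrete, recall (in notation we assume here for illustration; it is not taken from the abstract) that for a composite problem min_x f(x) + g(F(x)) the augmented Lagrangian with parameter r > 0 can be expressed through the Moreau envelope e_{1/r}g of g, and a convex-concave-type local saddle point at (\bar x, \bar y) means the two-sided inequality below holds for all (x, y) near (\bar x, \bar y):
\[
\ell_r(x,y) \;=\; f(x) + \inf_{u}\Bigl\{\, g(F(x)+u) - \langle y, u\rangle + \tfrac{r}{2}\|u\|^{2} \Bigr\}
\;=\; f(x) + e_{1/r}g\!\Bigl(F(x) + \tfrac{1}{r}y\Bigr) - \tfrac{1}{2r}\|y\|^{2},
\]
\[
\ell_r(\bar x, y) \;\le\; \ell_r(\bar x, \bar y) \;\le\; \ell_r(x, \bar y).
\]
The stronger version of variational sufficiency mentioned above additionally involves local strong convexity of \ell_r(\cdot, \bar y) around \bar x.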
The augmented Lagrangian method (ALM) is extended to a broader-than-ever setting of generalized nonlinear programming in convex and nonconvex optimization that is capable of handling many common manifestations of nonsmoothness. With the help of a recently developed sufficient condition for local optimality, it is shown to be derivable from the proximal point algorithm through a kind of local duality corresponding to an optimal solution and accompanying multiplier vector that furnish a local saddle point of the augmented Lagrangian. This approach leads to surprising insights into stepsize choices and new results on linear convergence that draw on recent advances in convergence properties of the proximal point algorithm. Local linear convergence is shown to be assured for a class of model functions that covers more territory than before.
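As a concrete point of reference, the following minimal sketch runs the classical equality-constrained specialization of the ALM loop that the abstract generalizes; the problem data, the fixed parameter r, the subproblem solver, and the iteration count are illustrative assumptions of ours, not prescriptions from the paper.

import numpy as np
from scipy.optimize import minimize

def alm(f, h, x0, y0, r=10.0, iters=20):
    """Classical augmented Lagrangian loop for  min f(x)  s.t.  h(x) = 0  (h vector-valued)."""
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(iters):
        # primal step: (approximately) minimize the augmented Lagrangian in x
        def aug(z):
            return f(z) + y @ h(z) + 0.5 * r * np.sum(h(z) ** 2)
        x = minimize(aug, x).x
        # multiplier step: y^{k+1} = y^k + r h(x^{k+1}), the proximal-point-type update
        y = y + r * h(x)
    return x, y

# toy example: minimize x1^2 + x2^2 subject to x1 + x2 - 1 = 0  (solution (0.5, 0.5), multiplier -1)
f = lambda z: z[0] ** 2 + z[1] ** 2
h = lambda z: np.array([z[0] + z[1] - 1.0])
x_star, y_star = alm(f, h, x0=[0.0, 0.0], y0=[0.0])
print(x_star, y_star)

In the generalized setting of the abstract, the equality constraint and quadratic penalty are replaced by a nonsmooth model term handled through its proximal mapping; the sketch above only illustrates the basic primal-minimization / multiplier-update structure.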