We investigate the convergence of subgradient-oriented descent methods in non-smooth non-convex optimization. We prove convergence in the sense of subsequences for functions with a strict standard model, and we show that convergence to a single critical point may be guaranteed if the Kurdyka-Łojasiewicz inequality is satisfied. We show, by way of an example, that the Kurdyka-Łojasiewicz inequality alone is not sufficient to guarantee convergence to critical points.
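For orientation, one standard formulation of the Kurdyka-Łojasiewicz inequality reads as follows (the paper may work with a variant of this definition): a lower semicontinuous function $f$ satisfies the KL inequality at a critical point $\bar x$ if there exist $\eta > 0$, a neighborhood $U$ of $\bar x$, and a continuous concave function $\varphi : [0,\eta) \to [0,\infty)$ with $\varphi(0) = 0$, $\varphi$ continuously differentiable with $\varphi' > 0$ on $(0,\eta)$, such that
\[
\varphi'\bigl(f(x) - f(\bar x)\bigr)\,\operatorname{dist}\bigl(0, \partial f(x)\bigr) \ge 1
\]
for all $x \in U$ with $f(\bar x) < f(x) < f(\bar x) + \eta$. Intuitively, $\varphi$ desingularizes $f$ near $\bar x$, ruling out functions that are too flat around their critical points.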