We study the least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space as a special case. We first investigate regularized algorithms adapted to a projection operator on a closed subspace of the Hilbert space. We prove convergence results with respect to variants of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a result, we obtain optimal rates for regularized algorithms with randomized sketches, provided that the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates that do not have any saturation effect for sketched/Nyström regularized algorithms, considering both the attainable and non-attainable cases, in the well-conditioned regimes. We then study stochastic gradient methods with projection over the subspace, allowing multiple passes over the data and minibatches, and we derive similar optimal statistical convergence results.
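As a rough illustration of the sketching idea described in the abstract (not the paper's exact estimator), the following NumPy sketch restricts the kernel ridge regression coefficients to the row space of a random Gaussian sketch matrix S of dimension m, solving the reduced m-by-m system instead of the full n-by-n one. The Gaussian kernel, sketch dimension `m`, and regularization `lam` are illustrative choices:

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sketched_krr(X, y, m, lam, gamma=1.0, seed=None):
    # Sketched kernel ridge regression (illustrative): restrict the
    # coefficient vector to alpha = S^T beta for a random sketch
    # S (m x n), and solve the reduced system
    #   (S K K S^T + n * lam * S K S^T) beta = S K y.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    S = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sketch
    SK = S @ K
    A = SK @ K @ S.T + n * lam * SK @ S.T         # m x m system
    beta = np.linalg.solve(A, SK @ y)
    return S.T @ beta                              # alpha, length n

# Usage sketch on synthetic 1-D data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha = sketched_krr(X, y, m=30, lam=1e-3, seed=0)
y_hat = gaussian_kernel(X, X) @ alpha  # in-sample predictions
```

The point of the construction is that only an m-by-m linear system is factored, with m on the order of the effective dimension rather than the sample size n.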
The article presents regularized algorithms for the identification of uncertain dynamic control objects. Various algorithms for identifying dynamic control objects are analyzed under conditions of approximate assi...
We propose a variable KM-like method for approximating common fixed points of a possibly countably infinite family of nonexpansive mappings in a Hilbert space, and we prove that it converges strongly to a common fixed point of the family. Our variable KM-like technique is applied to solve the split feasibility problem and the multiple-sets split feasibility problem; in particular, the minimum-norm solutions of both problems are derived. Our results can be viewed as an improvement and refinement of previously known results.
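The underlying iteration can be sketched as follows. This is a minimal illustration with a constant relaxation parameter and hypothetical box constraints C and Q, not the paper's variable scheme: the Krasnosel'skii–Mann update x_{k+1} = (1 - t) x_k + t T(x_k) is applied to the CQ operator T(x) = P_C(x - γ Aᵀ(Ax - P_Q(Ax))) for the split feasibility problem (find x in C with Ax in Q):

```python
import numpy as np

def km_iterate(T, x0, steps=500, t=0.5):
    # Krasnosel'skii-Mann iteration x_{k+1} = (1 - t) x_k + t T(x_k)
    # for a nonexpansive mapping T, with constant relaxation t in (0, 1).
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = (1 - t) * x + t * T(x)
    return x

# Illustrative split feasibility problem: find x in C with A x in Q,
# where C and Q are boxes, so the projections are coordinatewise clipping.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])
P_C = lambda x: np.clip(x, -1.0, 1.0)   # C = [-1, 1]^2
P_Q = lambda y: np.clip(y, 0.0, 1.0)    # Q = [0, 1]^2

gamma = 0.9 / np.linalg.norm(A, 2) ** 2  # step in (0, 2 / ||A||^2)
T = lambda x: P_C(x - gamma * A.T @ (A @ x - P_Q(A @ x)))  # CQ operator

x_star = km_iterate(T, np.array([5.0, -5.0]))
```

For γ in (0, 2/‖A‖²) the CQ operator is nonexpansive (in fact averaged), so the KM iterates approach a solution of the split feasibility problem whenever one exists.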