Author affiliations: Univ Paris Saclay, CentraleSupélec, CVN, Inria, Gif-sur-Yvette, France; GE Healthcare, Buc, France
Published in: Inverse Problems
Year/Volume/Issue: 2021, Vol. 37, No. 6
Subject classification: 07 [Natural Sciences]; 0701 [Natural Sciences – Mathematics]; 0702 [Natural Sciences – Physics]
Funding: European Research Council [ERC-2019-STG-850925]; ANRT CIFRE Convention [2018/1587]; ANR AI Chair BRIGEABLE; Institut Universitaire de France
Keywords: adjoint mismatch; convex optimization; convergence analysis; fixed point methods; image reconstruction; computed tomography; forward-backward algorithm
Abstract: We consider the proximal gradient algorithm for solving penalized least-squares minimization problems arising in data science. This first-order algorithm is attractive due to its flexibility and minimal memory requirements, which allow it to tackle large-scale minimization problems involving non-smooth penalties. However, for problems such as x-ray computed tomography, the cost of the algorithm is dominated by the application of the forward linear operator and its adjoint at each iteration. In practice, the adjoint operator is thus often replaced by an alternative operator, with the aim of reducing the overall computational burden and potentially improving conditioning. In this paper, we analyze the effect of such an adjoint mismatch on the convergence of the proximal gradient algorithm in an infinite-dimensional setting, thus generalizing existing results on the PGA. We derive conditions on the step-size and on the gradient of the smooth part of the objective function under which convergence of the algorithm to a fixed point is guaranteed. We also derive bounds on the error between this fixed point and the solution to the original minimization problem. We illustrate our theoretical findings with two image reconstruction tasks in computed tomography.
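The iteration studied in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it solves a small synthetic lasso-type problem 0.5·||Hx − y||² + λ||x||₁ with the true adjoint Hᵀ replaced by a perturbed surrogate K, mimicking the adjoint mismatch; the operator sizes, perturbation level, and step-size choice below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def mismatched_pga(H, K, y, lam, gamma, n_iter=500):
    """Proximal gradient iteration for 0.5*||Hx - y||^2 + lam*||x||_1,
    where the exact adjoint H.T is replaced by a surrogate K
    (the 'adjoint mismatch' setting analyzed in the paper)."""
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = K @ (H @ x - y)                      # mismatched gradient step
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x

# Small synthetic demo with a mildly perturbed adjoint.
rng = np.random.default_rng(0)
H = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
y = H @ x_true
K = H.T + 0.01 * rng.standard_normal((20, 40))      # mismatched adjoint
gamma = 1.0 / np.linalg.norm(H.T @ H, 2)            # step-size below 1/L
x_hat = mismatched_pga(H, K, y, lam=0.1, gamma=gamma)
```

For a small mismatch, the iterates still settle at a fixed point close to the true minimizer; the paper's contribution is to make this observation rigorous, giving step-size conditions for convergence and bounds on the distance between the fixed point and the original solution.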