Firstly, this paper proposes a generalized log-determinant optimization model with the purpose of estimating high-dimensional sparse inverse covariance matrices. Under the normality assumption, the zero components in the inverse covariance matrices represent the conditional independence between pairs of variables given all the other variables. The generalized model considered in this study, because of the setting of the eigenvalue bounded constraints, covers a large number of existing estimators as special cases. Secondly, rather than directly tackling the challenging optimization problem, this paper uses a couple of alternating direction methods of multipliers (ADMM) to solve its dual model, in which five separable structures are involved. The first implemented algorithm is based on a single Gauss–Seidel iteration, but it does not necessarily converge theoretically. In contrast, the second algorithm employs the symmetric Gauss–Seidel (sGS) based ADMM, which is equivalent to the 2-block iterative scheme from the latest sGS decomposition theorem. Finally, we conduct numerical simulations on synthetic data and a real data set, which show that both algorithms are very effective in estimating high-dimensional sparse inverse covariance matrices.
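The eigenvalue bounded constraints mentioned in this abstract are typically enforced through a spectral projection inside each ADMM iteration. As a minimal illustrative sketch (not the authors' code; the bounds `lam_min` and `lam_max` are placeholder values), projecting a symmetric matrix onto the set {X : lam_min·I ⪯ X ⪯ lam_max·I} amounts to clipping its eigenvalues:

```python
import numpy as np

def project_eigenvalue_box(M, lam_min, lam_max):
    """Project a symmetric matrix M onto {X : lam_min*I <= X <= lam_max*I}
    (in the Loewner order) by clipping its eigenvalues."""
    # Symmetrize first to guard against round-off asymmetry.
    S = 0.5 * (M + M.T)
    w, V = np.linalg.eigh(S)
    w_clipped = np.clip(w, lam_min, lam_max)
    # V * w_clipped rescales each eigenvector column by its clipped eigenvalue.
    return (V * w_clipped) @ V.T

# Example: a random symmetric matrix whose eigenvalues fall outside [0.1, 10].
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = A + A.T
P = project_eigenvalue_box(M, 0.1, 10.0)
```

Because the constraint set is spectral, this projection is the exact Euclidean projection, and its cost is a single eigendecomposition per iteration.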
Authors:
Li, Peili; Xiao, Yunhai
Wuhan Univ, Sch Math & Stat, Wuhan 430072, Hubei, Peoples R China
Henan Univ, Sch Math & Stat, Kaifeng 475000, Peoples R China
Henan Univ, Sch Math & Stat, Inst Appl Math, Kaifeng 475000, Peoples R China
Estimating a large and sparse inverse covariance matrix plays a fundamental role in modern multivariate analysis, because the zero entries capture the conditional independence between pairs of variables given all other variables. This estimation task can be realized by penalizing the maximum likelihood estimation with an adaptive group lasso penalty imposed directly on the elements of the inverse, which allows the estimate to have a blockwise sparse structure that is particularly useful in some applications. In this paper, we are particularly interested in studying the implementation of optimization algorithms for minimizing a class of log-determinant models. On the one hand, the considered minimization model contains a large number of popular sparse models as special cases; on the other hand, it poses more challenges, especially in high-dimensional situations. Instead of targeting the challenging optimization problem directly, we employ the symmetric Gauss-Seidel (sGS) iteration based alternating direction method of multipliers (ADMM) to tackle the 3-block nonsmooth dual program. By choosing an appropriate proximal term, the implemented sGS-ADMM is shown to be equivalent to the 2-block ADMM, so its convergence follows directly from existing theoretical results. Numerical experiments on synthetic data and real data sets, including performance comparisons with the directly extended ADMM, demonstrate that the implemented algorithm is effective in estimating large and sparse inverse covariance matrices. (C) 2018 Elsevier B.V. All rights reserved.
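The blockwise sparsity described above comes from the proximal operator of the group lasso penalty, which shrinks each index block toward zero and sets it exactly to zero when its norm is small. A minimal sketch, assuming a simple block partition given as index lists (the function name and block format are illustrative, not from the paper):

```python
import numpy as np

def prox_group_lasso(X, blocks, tau):
    """Proximal operator of tau * sum_g ||X[g]||_F over index blocks.
    Each block is shrunk toward zero and zeroed out entirely when its
    Frobenius norm is at most tau -- this produces blockwise sparsity.
    `blocks` is a list of (row_indices, col_indices) pairs."""
    Y = X.copy()
    for rows, cols in blocks:
        B = X[np.ix_(rows, cols)]
        nrm = np.linalg.norm(B)
        if nrm <= tau:
            Y[np.ix_(rows, cols)] = 0.0          # whole block killed
        else:
            Y[np.ix_(rows, cols)] = (1.0 - tau / nrm) * B  # shrunk block
    return Y

# Example: the first row-block has small norm and is zeroed; the second shrinks.
X = np.array([[0.3, 0.1],
              [4.0, 3.0]])
blocks = [([0], [0, 1]), ([1], [0, 1])]
Y = prox_group_lasso(X, blocks, tau=1.0)
```

An adaptive variant would simply use a different `tau` per block; the shrinkage formula is unchanged.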
With the appearance of the approach named "robust alignment by sparse and low-rank decomposition" (RASL), a number of linearly correlated images can be accurately and robustly aligned despite significant corruptions and occlusions. It has been discovered that this aligning task can be characterized as a sequence of 3-block convex minimization problems which can be solved efficiently by the accelerated proximal gradient method (APG), or alternatively, by the directly extended alternating direction method of multipliers (ADMM). However, the directly extended ADMM may diverge, although it often performs well in numerical computations. Ideally, one should find an algorithm which has both a theoretical guarantee and superior numerical efficiency over the directly extended ADMM. We achieve this goal by using the symmetric Gauss-Seidel iteration based ADMM (sGS-ADMM), which only needs to update one of the variables twice but, surprisingly, guarantees the desired convergence. The convergence of sGS-ADMM follows directly by relating it to the classical 2-block ADMM with a couple of specially designed semi-proximal terms. Beyond this, we also add a rank correction term to the model with the purpose of deriving alignment results with higher accuracy. The numerical experiments over a wide range of realistic misalignments demonstrate that sGS-ADMM is at least two times faster than RASL and APG for the vast majority of the tested problems. (C) 2018 Elsevier B.V. All rights reserved.
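In sparse-plus-low-rank decompositions of the RASL type, the low-rank subproblem in each ADMM iteration reduces to the proximal operator of the nuclear norm, i.e. singular value thresholding. A minimal sketch (illustrative, not the paper's implementation):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the prox of tau*||.||_* at M.
    Soft-thresholds the singular values, which both denoises and
    lowers the rank of the resulting matrix."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    # U * s_shrunk rescales each left singular vector by its shrunk value.
    return (U * s_shrunk) @ Vt

# Example on a diagonal matrix: singular values 3, 1, 0.2 with tau = 0.5
# become 2.5, 0.5, 0 -- the smallest one is eliminated, reducing the rank.
L = svt(np.diag([3.0, 1.0, 0.2]), 0.5)
```

The sparse subproblem is the analogous elementwise soft-thresholding, so each ADMM pass costs essentially one SVD.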
Magnetic Resonance Imaging (MRI) is a medical imaging technology used for diagnostic imaging of diseases, but its image quality may suffer from the long acquisition time. The compressive sensing (CS) based strategy may decrease the reconstruction time greatly, but it needs efficient reconstruction algorithms to produce high-quality and reliable images. This paper focuses on the algorithmic improvement for the sparse reconstruction of CS-MRI, especially considering a non-smooth convex minimization problem which is composed of the sum of a total variation regularization term and an l(1)-norm term of the wavelet transformation. Part of the motivation for targeting the dual problem is that the dual variables lie in a relatively low-dimensional subspace. Instead of solving the primal model as usual, we turn our attention to its associated dual model composed of three variable blocks and two separable non-smooth function blocks. However, the directly extended alternating direction method of multipliers (ADMM) must be avoided because it may be divergent, although it usually performs well numerically. In order to solve the problem, we employ a symmetric Gauss-Seidel (sGS) technique based ADMM. Compared with the directly extended ADMM, this method only needs one additional iteration, but its convergence can be guaranteed theoretically. Besides, we also propose a generalized variant of ADMM, because this method has been illustrated to be efficient for solving semidefinite programming in the past few years. Finally, we conduct extensive experiments on MRI reconstruction using simulated and real MRI images under different sampling patterns and ratios. The numerical results demonstrate that the proposed algorithms achieve high reconstruction accuracy with fast computational speed.
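The CS-MRI measurement model underlying this abstract is a subsampled Fourier transform, y = A x with A = M∘F for a binary k-space mask M. A standard sanity check when implementing any primal or dual reconstruction algorithm is the adjoint identity ⟨Ax, y⟩ = ⟨x, Aᴴy⟩. A small self-contained sketch, assuming an orthonormal 2-D FFT and a random mask (all names here are illustrative):

```python
import numpy as np

def make_sampling_ops(mask):
    """Subsampled 2-D Fourier operator A and its adjoint At, as in the
    CS-MRI measurement model y = A x. Using the orthonormal FFT keeps
    the adjoint equal to the masked inverse FFT."""
    def A(x):
        return mask * np.fft.fft2(x, norm="ortho")
    def At(y):
        return np.fft.ifft2(mask * y, norm="ortho")
    return A, At

rng = np.random.default_rng(1)
mask = (rng.random((8, 8)) < 0.4).astype(float)  # ~40% k-space sampling
A, At = make_sampling_ops(mask)

x = rng.standard_normal((8, 8))
y = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
# Adjoint test: <A x, y> should equal <x, At y> up to round-off.
lhs = np.vdot(A(x), y)
rhs = np.vdot(x, At(y))
```

If this identity fails, gradient and dual updates built from A and At will silently be wrong, so the check is worth running before any TV/wavelet-regularized solver.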
We analyze the proximal alternating linearized minimization algorithm (PALM) for solving non-smooth convex minimization problems where the objective function is a sum of a smooth convex function and block separable non-smooth extended real-valued convex functions. We prove a global non-asymptotic sublinear rate of convergence for PALM. When the number of blocks is two and the smooth coupling function is quadratic, we present a fast version of PALM which is proven to share a global sublinear rate efficiency estimate improved by a square root factor. Some numerical examples illustrate the potential benefits of the proposed schemes.
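The two-block setting with a quadratic coupling can be sketched concretely: for min 0.5·||Ax + By − c||² + λ||x||₁ + λ||y||₁, PALM alternates proximal-gradient steps on x and y with blockwise Lipschitz step sizes. A minimal sketch under these assumptions (the function names and problem instance are illustrative, not from the paper):

```python
import numpy as np

def soft(v, t):
    """Elementwise soft-thresholding: prox of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def palm_l1(A, B, c, lam, iters=200):
    """PALM for min 0.5*||A x + B y - c||^2 + lam*||x||_1 + lam*||y||_1:
    alternate linearized proximal steps on each block, using the
    blockwise Lipschitz constants ||A||^2 and ||B||^2 as step scalings."""
    x = np.zeros(A.shape[1])
    y = np.zeros(B.shape[1])
    Lx = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad wrt x
    Ly = np.linalg.norm(B, 2) ** 2   # Lipschitz constant of grad wrt y
    for _ in range(iters):
        r = A @ x + B @ y - c
        x = soft(x - (A.T @ r) / Lx, lam / Lx)   # prox-gradient step on x
        r = A @ x + B @ y - c                    # refresh residual (Gauss-Seidel style)
        y = soft(y - (B.T @ r) / Ly, lam / Ly)   # prox-gradient step on y
    return x, y

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 10))
B = rng.standard_normal((20, 10))
c = rng.standard_normal(20)
x, y = palm_l1(A, B, c, lam=0.1)
```

The fast variant mentioned in the abstract would add momentum (FISTA-style extrapolation) on top of these same block updates; the plain version above already decreases the objective monotonically.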