When the Newton-Raphson algorithm or the Fisher scoring algorithm does not work and EM-type algorithms are not available, the quadratic lower-bound (qlb) algorithm may be a useful optimization tool. However, like all EM-type algorithms, the qlb algorithm may suffer from slow convergence, which can be viewed as the cost of having the ascent property. This paper proposes a novel 'shrinkage parameter' approach to accelerate the qlb algorithm while maintaining its simplicity and stability (i.e., the monotone increase of the log-likelihood). The strategy is first to construct a class of quadratic surrogate functions $Q_r(\theta \mid \theta^{(t)})$ that induces a class of qlb algorithms indexed by a shrinkage parameter $r$ ($r \in \mathcal{R}$), and then to optimize $r$ over $\mathcal{R}$ under some criterion of convergence. For three commonly used criteria (the smallest eigenvalue, the trace, and the determinant), we derive a uniformly optimal shrinkage parameter and identify the corresponding optimal qlb algorithm. Theoretical justifications are also presented. We then generalize the optimal qlb algorithm to problems with a penalizing function and investigate the associated convergence properties. The optimal qlb algorithm is applied to fit a logistic regression model and a Cox proportional hazards model, and two real datasets are analyzed to illustrate the proposed methods. (C) 2011 Elsevier B.V. All rights reserved.
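For context, the following is a minimal sketch of the baseline qlb update for logistic regression that the paper accelerates, using the classical Bohning-Lindsay fixed bound B = X'X/4 on the negative Hessian, which guarantees the monotone ascent property mentioned in the abstract. The paper's r-indexed surrogates Q_r replace this fixed bound, but their exact form is not recoverable from the abstract, so only the standard algorithm is shown; the function name qlb_logistic and its defaults are illustrative, not from the paper.

    import numpy as np

    def qlb_logistic(X, y, max_iter=500, tol=1e-8):
        """Baseline qlb updates for logistic regression.

        Uses the fixed curvature bound B = X'X / 4 (Bohning-Lindsay),
        which dominates the negative Hessian X'WX for all theta and
        therefore guarantees a monotone increase in the log-likelihood.
        """
        n, p = X.shape
        theta = np.zeros(p)
        B = X.T @ X / 4.0                 # global bound on the negative Hessian
        B_inv = np.linalg.inv(B)          # factor once; B never changes
        for _ in range(max_iter):
            mu = 1.0 / (1.0 + np.exp(-X @ theta))  # fitted probabilities
            grad = X.T @ (y - mu)                  # score vector
            step = B_inv @ grad
            theta = theta + step                   # qlb (lower-bound) update
            if np.max(np.abs(step)) < tol:
                break
        return theta

    # Illustrative usage on simulated data
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 3))
    true_theta = np.array([1.0, -0.5, 0.25])
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_theta))).astype(float)
    theta_hat = qlb_logistic(X, y)

Because B is fixed across iterations, each step costs only a matrix-vector product after the one-time factorization; the price is the slow convergence the paper addresses, since B is generally a loose bound on the actual curvature.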