A Proximal Point Algorithm For Log-Determinant Optimization With Group Lasso Regularization

Published 12 August 2016, 2:08
Many applications of statistical learning involve estimating a sparse inverse covariance matrix from samples of random variables. These problems can be formulated as sparse covariance selection problems, i.e. finding an estimate of the inverse covariance matrix that maximizes the log-likelihood while imposing a sparsity constraint. The sparsity constraint is usually replaced by a penalized weighted L1-norm of the inverse covariance matrix, yielding a convex relaxation of the original sparse covariance selection problem. We propose an inexact interior-point primal-dual path-following algorithm with a Mehrotra-type predictor-corrector search direction for L1-regularized log-determinant and covariance selection problems. At each interior-point iteration, the search direction is computed from a symmetric indefinite linear system (called the augmented equation) of dimension m + n(n+1)/2, where m is the number of constraints and n is the dimension of the covariance matrix. Such linear systems are large and dense when n exceeds a few hundred and can only be solved by iterative methods. We construct efficient preconditioners and require only an inexact solution of the augmented equation at each interior-point iteration. Numerical experiments on a variety of large-scale problems show that the proposed method is efficient and robust.
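For concreteness, a sketch of the convex relaxation described in the abstract, written in the standard covariance selection setup; the symbols S (the sample covariance matrix), ρ (the matrix of nonnegative penalty weights), and the linear map A with right-hand side b encoding the m constraints are assumed notation for illustration, not taken from the abstract:

\[
\max_{X \succ 0} \;\; \log\det X \;-\; \langle S, X\rangle \;-\; \sum_{i,j} \rho_{ij}\,|X_{ij}|
\qquad \text{subject to} \qquad \mathcal{A}(X) = b,
\]

where X is the estimate of the inverse covariance matrix and the weighted L1 term is the convex surrogate for the sparsity constraint. Since a symmetric n×n matrix has n(n+1)/2 independent entries, the Newton-type system coupling X with the m constraint multipliers has dimension m + n(n+1)/2, which is where the size of the augmented equation comes from.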