eScholarship
Open Access Publications from the University of California

UCLA Electronic Theses and Dissertations

Estimation of Graphical Models: Convex Formulations and Algorithms

Abstract

A Gaussian graphical model is a graph representation of conditional independence relations among Gaussian random variables. A fundamental problem in the estimation of Gaussian graphical models is the selection of the graph topology given relatively small amounts of data. This problem is often solved via L1-regularized maximum likelihood estimation, for which many large-scale convex optimization algorithms have been developed. In this thesis, we consider several extensions of Gaussian graphical models and develop fast algorithms based on convex optimization methods.
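
For concreteness, the sketch below computes the standard L1-regularized maximum likelihood estimate (the graphical lasso) with scikit-learn; the random data, the regularization value alpha, and the edge threshold are illustrative assumptions, not values or code from the thesis.

# Minimal sketch: graphical lasso estimation of a Gaussian graphical model.
# The data, alpha, and threshold below are illustrative assumptions.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))      # 200 samples, 10 variables

model = GraphicalLasso(alpha=0.1)       # alpha sets the strength of the L1 penalty
model.fit(X)

K = model.precision_                    # estimated inverse covariance matrix
edges = np.abs(K) > 1e-6                # nonzero off-diagonal entries = graph edges
np.fill_diagonal(edges, False)
print(np.argwhere(np.triu(edges)))      # estimated edge list

The zero pattern of the estimated inverse covariance K is then read off as the graph topology: variables i and j are joined by an edge exactly when K[i, j] is nonzero.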

As a first extension, we consider the restricted sparse inverse covariance selection problem, in which the set of zero entries of the inverse covariance matrix is partially known and an L1-norm penalty is applied to the remaining entries. The proximal Newton method is an attractive algorithm for this problem, since its key computations, including the evaluation of the gradient and Hessian of the log-likelihood function, can be implemented efficiently with sparse chordal matrix techniques. We analyze the convergence of the inexact proximal Newton method for the penalized maximum likelihood problem. The convergence analysis applies to a wider class of problems with a self-concordant term in the objective. The numerical results indicate that the method reaches high accuracy even when the proximal Newton steps are computed inexactly.
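
One way to write the restricted problem described above is the convex program below (a sketch in LaTeX notation); the sample covariance C, the known zero pattern Z, and the weight lambda are notation introduced here for illustration.

\begin{array}{ll}
\mbox{minimize}   & \operatorname{tr}(CX) - \log\det X
                    + \lambda \sum_{(i,j) \notin \mathcal{Z},\; i \neq j} |X_{ij}| \\
\mbox{subject to} & X_{ij} = 0, \quad (i,j) \in \mathcal{Z},
\end{array}

with variable X (the inverse covariance matrix), where the index set Z collects the entries known in advance to be zero and the L1 penalty acts only on the remaining off-diagonal entries.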

As a second extension, we consider Gaussian graphical models for time series, with a focus on the joint estimation of multiple time series graphical models that share similar graph structures, or an identical graph structure with different edge coefficients. We formulate a method that estimates the models simultaneously, with a group penalty on the edge coefficients across models. We apply the Douglas-Rachford algorithm to solve the joint estimation problem, and provide model selection methods for choosing the parameters. Examples with both synthetic and real data (fMRI brain activity and international stock markets) demonstrate the advantage of the joint estimation method.
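
As an illustration of the splitting method mentioned above, the following is a minimal, generic sketch of the Douglas-Rachford iteration for minimizing a sum f + g given the two proximal operators; the step size, stopping rule, and the toy prox operators in the example are assumptions made here, not the thesis's implementation for the joint model.

# Generic Douglas-Rachford splitting for minimizing f(x) + g(x),
# given prox_f and prox_g. The parameters below are illustrative.
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, t=1.0, max_iter=500, tol=1e-8):
    z = z0.copy()
    for _ in range(max_iter):
        x = prox_f(z, t)                   # x = prox_{t f}(z)
        y = prox_g(2.0 * x - z, t)         # prox of g at the reflected point
        z_next = z + (y - x)               # update the auxiliary variable
        if np.linalg.norm(z_next - z) <= tol * max(1.0, np.linalg.norm(z)):
            z = z_next
            break
        z = z_next
    return prox_f(z, t)                    # x* at (approximate) convergence

# Toy example: f(x) = 0.5*||x - a||^2 and g(x) = lam*||x||_1 (soft-threshold prox).
a, lam = np.array([3.0, -0.2, 0.05]), 1.0
prox_f = lambda v, t: (v + t * a) / (1.0 + t)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
print(douglas_rachford(prox_f, prox_g, np.zeros(3)))   # approximately [2, 0, 0]

The same template applies whenever the objective splits into two terms with inexpensive proximal operators, which is what makes the method attractive for the group-penalized joint estimation problem.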

The last extension generalizes Gaussian graphical models for time series to include latent variables. We illustrate the effect of latent variables on the conditional independence structure, and describe a Gaussian graphical model for time series with latent variables. The Douglas-Rachford method is applied to this problem. Simulations with synthetic data demonstrate how the method recovers the graph topology.
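
For intuition about the role of latent variables, a widely used static (non-time-series) analogue is the sparse-plus-low-rank program below, in the style of Chandrasekaran, Parrilo, and Willsky; the symbols (sample covariance C, sparse term S, low-rank term L, weights lambda and gamma) are assumptions introduced here, and the time-series formulation in the thesis is more general.

\begin{array}{ll}
\mbox{minimize}   & \operatorname{tr}\bigl(C(S-L)\bigr) - \log\det(S-L)
                    + \lambda \bigl( \gamma \|S\|_1 + \operatorname{tr} L \bigr) \\
\mbox{subject to} & S - L \succ 0, \quad L \succeq 0,
\end{array}

where the sparse term S encodes the conditional independence graph among the observed variables and the low-rank term L accounts for the marginalized latent variables.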
