UC San Diego Electronic Theses and Dissertations

Structured Learning with Scale Mixture Priors

Abstract

Sparsity plays an essential role in many modern signal processing and machine learning algorithms. This thesis examines how additional structural information can be incorporated into the sparsity profile to formulate a richer class of learning approaches. The focus is on Bayesian techniques for promoting sparsity, with an emphasis on developing novel priors and inference schemes.

The thesis begins by showing how structured sparsity can be used to recover simultaneously block-sparse signals in the presence of outliers. The approach is validated empirically on synthetic data as well as on the multiple-measurement face recognition problem.
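For concreteness, one standard way to pose this recovery problem is the following (a generic sketch of a simultaneous block-sparse-plus-outlier model; the exact formulation in the thesis may differ):

$$ \mathbf{Y} = \mathbf{A}\mathbf{X} + \mathbf{E} + \mathbf{V}, $$

where $\mathbf{Y}$ collects the multiple measurement vectors as columns, $\mathbf{X}$ is simultaneously block sparse (its nonzero rows are confined to a few blocks shared across all columns), $\mathbf{E}$ is a row-sparse matrix modeling outliers, and $\mathbf{V}$ is dense noise.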

The next portion of the thesis focuses on how structured sparsity can be used to extend approaches for dictionary learning. Dictionary learning refers to decomposing a data matrix into the product of a dictionary and a coefficient matrix, subject to a sparsity constraint on the coefficient matrix.
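In its most common form, this corresponds to the standard optimization problem

$$ \min_{\mathbf{D},\,\mathbf{X}} \; \lVert \mathbf{Y} - \mathbf{D}\mathbf{X} \rVert_F^2 \quad \text{subject to} \quad \lVert \mathbf{x}_i \rVert_0 \le k \;\; \text{for all } i, $$

where $\mathbf{Y}$ is the data matrix, $\mathbf{D}$ the dictionary, $\mathbf{x}_i$ the columns of the coefficient matrix $\mathbf{X}$, and $k$ the sparsity level.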

Chapter 3 studies structure in the form of non-negativity constraints on the unknowns, referred to as the sparse non-negative least squares (S-NNLS) problem. It presents a unified framework for S-NNLS based on a novel prior on the sparse codes and provides an efficient multiplicative inference procedure. The framework is then extended to sparse non-negative matrix factorization (S-NMF), and the proposed approach is proven to converge to a set of stationary points for both the S-NNLS problem and a subclass of S-NMF problems.
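To give a flavor of multiplicative inference, the sketch below implements the classical multiplicative update rule for NNLS. It is illustrative only: the thesis derives its own multiplicative S-NNLS updates from a scale-mixture prior, whereas this is the standard Lee-Seung-style rule, and the function name and parameters here are mine.

    import numpy as np

    def nnls_multiplicative(A, y, n_iter=500, eps=1e-12):
        # Classical multiplicative update for min_x ||y - Ax||^2 with x >= 0.
        # Illustrative sketch only, not the thesis's derived update.
        # Assumes A and y are entrywise non-negative, which keeps the
        # iterates non-negative throughout.
        x = np.ones(A.shape[1])           # strictly positive initialization
        Aty = A.T @ y                     # precompute A^T y
        for _ in range(n_iter):
            AtAx = A.T @ (A @ x)          # A^T A x for the current iterate
            x = x * Aty / (AtAx + eps)    # elementwise multiplicative update
        return x

Because the update only multiplies the iterate by non-negative factors, non-negativity is maintained automatically, which is what makes multiplicative schemes attractive for S-NNLS.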

Finally, Chapter 4 addresses the problem of learning dictionaries for multimodal datasets. It presents the multimodal sparse Bayesian dictionary learning (MSBDL) algorithm, which leverages information from all available data modalities through a joint sparsity constraint on each modality's sparse codes, without restricting the coefficients themselves to be equal. The proposed framework offers practitioners considerable flexibility and addresses many shortcomings of existing multimodal dictionary learning approaches. Unlike existing approaches, MSBDL allows the dictionaries for the different data modalities to have different cardinalities. In addition, MSBDL scales from small datasets to large, high-dimensional ones, and it can also be used in supervised settings.
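A simplified reading of this joint sparsity constraint, assuming for clarity that all dictionaries have the same number of atoms (MSBDL itself also handles different cardinalities), is

$$ \mathbf{Y}^{(j)} \approx \mathbf{D}^{(j)}\mathbf{X}^{(j)}, \quad j = 1,\dots,J, \qquad \text{with } x^{(1)}_{ki} = 0 \iff x^{(j)}_{ki} = 0 \;\text{ for all } j, $$

so that the sparse codes of a given sample share the same support across modalities while their nonzero values remain free.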
