UC Berkeley Electronic Theses and Dissertations

Randomized Pivoting and Spectrum-Revealing Bounds in Numerical Linear Algebra

Abstract

In the first part of this dissertation, we explore a novel randomized pivoting strategy that efficiently improves the reliability and quality of the LU factorization. Gaussian elimination with partial pivoting (GEPP) has long been among the most widely used methods for computing the LU factorization of a given matrix. However, the method is also known to fail on matrices that induce large element growth during the factorization. We propose a new scheme, Gaussian elimination with randomized complete pivoting (GERCP), for the efficient and reliable LU factorization of a given matrix. GERCP satisfies GECP-style (Gaussian elimination with complete pivoting) element growth bounds with high probability, yet costs only marginally more than GEPP in terms of algorithmic complexity and run time. Our numerical experiments strongly suggest that GERCP is as reliable as GECP and as efficient as GEPP for computing the LU factorization.

In the second part, this dissertation provides tighter and simplified analyses of several popular low-rank matrix approximation algorithms, including randomized subspace iteration and column/row selection based methods. We derive new bounds and unify them with existing bounds under the title Spectrum-Revealing Bounds. These bounds demonstrate how certain structure in the decay of a matrix's spectrum helps to "reveal" an increasingly accurate low-rank matrix approximation. We provide real-world applications that demonstrate the qualitative value of our bounds for anyone using low-rank matrix approximations. For randomized subspace iteration, we also dramatically improve and simplify the probabilistic analysis of previous works using intuitive and concise techniques.

Lastly, we apply the idea of efficient low-rank matrix approximation in the presence of spectral decay to speed up sparse principal component analysis (SPCA). We also develop novel lower bounds on the variance captured by each sparse principal component obtained after deflation.
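To make the element-growth failure mode of GEPP concrete, the sketch below (not taken from the dissertation) runs LU with partial pivoting on the classic Wilkinson growth matrix, for which no row swaps occur and the entries of U grow like 2^(n-1); the function name `wilkinson_growth_matrix` is illustrative only.

```python
# Minimal sketch: element growth of partial pivoting on the classic
# Wilkinson example (1 on the diagonal, -1 below it, 1 in the last column).
import numpy as np
from scipy.linalg import lu

def wilkinson_growth_matrix(n):
    """Matrix for which GEPP exhibits roughly 2**(n-1) element growth."""
    A = np.tril(-np.ones((n, n)), k=-1) + np.eye(n)
    A[:, -1] = 1.0
    return A

n = 60
A = wilkinson_growth_matrix(n)
P, L, U = lu(A)  # LU factorization with partial pivoting
growth = np.abs(U).max() / np.abs(A).max()
print(f"element growth factor: {growth:.3e}")  # on the order of 2**(n-1)
```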

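For the second part, the following is a minimal sketch of generic randomized subspace iteration in the Halko-Martinsson-Tropp style, not the dissertation's exact formulation or bounds; the oversampling parameter `p`, the iteration count `q`, and the test-matrix setup are illustrative assumptions. When the spectrum of A decays, a few power iterations sharpen the rank-k approximation.

```python
# Minimal sketch of randomized subspace iteration for a rank-k approximation.
import numpy as np

def randomized_subspace_iteration(A, k, p=10, q=2, seed=None):
    """Return U, s, Vt with A ~ U @ np.diag(s) @ Vt of rank k."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Gaussian test matrix with oversampling.
    Omega = rng.standard_normal((n, k + p))
    Q, _ = np.linalg.qr(A @ Omega)
    # Subspace (power) iterations with re-orthonormalization for stability.
    for _ in range(q):
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # Project onto the captured subspace and take a small SVD.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k, :]

# Usage: the error tracks the neglected part of a decaying spectrum.
rng = np.random.default_rng(0)
A = (rng.standard_normal((500, 80))
     @ np.diag(0.5 ** np.arange(80))
     @ rng.standard_normal((80, 400)))
U, s, Vt = randomized_subspace_iteration(A, k=10)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt, 2)
print(f"spectral-norm error: {err:.3e}")
```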