eScholarship
Open Access Publications from the University of California

UC Berkeley

UC Berkeley Electronic Theses and Dissertations

Learning in a Changing World: Covariate Shift, Subset Selection and Optimal PAC Bounds

Abstract

Learning theory has been a central topic in computer science for decades. Typical results in this field focus on the case where the change in the data distribution is minimal. Common wisdom holds that learning is significantly more challenging, both computationally and statistically, when the data distribution changes. In this thesis, we develop new techniques to bridge the gap between learning in a stationary environment and learning in a changing environment. Further, our techniques allow us to convert existing algorithms into ones that satisfy the required robustness without sacrificing computational or statistical efficiency.

The first major contribution is the development of smoothed online learning, a framework that interpolates between online learning and statistical learning. We develop optimal statistical and computational guarantees in this framework, establishing guarantees that nearly match the statistical case while allowing for data that changes over time. We further establish the power of this framework by showing that it also leads to improved guarantees for private learning. A second contribution is a set of general techniques for efficiently converting algorithms with error guarantees in expectation into algorithms with error guarantees that hold with high probability, without requiring uniform convergence. A third contribution is the development of data-selection techniques that improve both the computational efficiency of algorithms and their robustness to distributional changes.
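To make the expectation-to-high-probability conversion concrete, here is a minimal sketch of the classic confidence-boosting trick (an illustrative standard technique, not necessarily the construction developed in the thesis): run a learner that is only good in expectation on several independent splits of the data, then keep the candidate with the lowest held-out error. By Markov's inequality each run is decent with constant probability, so the best of k runs is good with probability exponentially close to one in k. The function names and the toy majority learner below are hypothetical, for illustration only.

```python
import random
import statistics

def boost_confidence(learner, data, k=5, val_frac=0.2):
    """Confidence boosting: turn an in-expectation error guarantee
    into a high-probability one.  `learner` maps a list of (x, y)
    pairs to a hypothesis h with h(x) -> predicted label.
    (Illustrative sketch; not the thesis's specific technique.)"""
    random.shuffle(data)
    n_val = max(1, int(len(data) * val_frac))
    val, train = data[:n_val], data[n_val:]
    # Train k candidates on disjoint chunks; each is good with
    # constant probability by Markov's inequality.
    chunk = len(train) // k
    candidates = [learner(train[i * chunk:(i + 1) * chunk]) for i in range(k)]
    # Select the candidate with the smallest held-out error; a
    # Chernoff bound on the validation set makes this choice reliable.
    def val_error(h):
        return statistics.mean(0.0 if h(x) == y else 1.0 for x, y in val)
    return min(candidates, key=val_error)

def majority_learner(sample):
    """Toy base learner: predict the majority label of its sample."""
    ones = sum(y for _, y in sample)
    label = 1 if 2 * ones >= len(sample) else 0
    return lambda x: label
```

For example, on data whose labels are 80% ones, `boost_confidence(majority_learner, data)` returns a hypothesis that predicts 1, even if an unlucky chunk misleads one of the base runs.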
