
UC Berkeley Electronic Theses and Dissertations

Convex geometric tools in information theory

Abstract

The areas of information theory and geometry mirror each other in remarkable ways, with several concepts in geometry having analogues in information theory. These observations provide a simple way to posit theorems in one area by translating the corresponding theorems in the other. However, the analogy does not extend fully, and the proof techniques often do not carry over without substantial modification. One reason for this is that information-theoretic quantities are often defined asymptotically, as the dimension tends to infinity. This is in contrast to the setting in geometry, where the dimension is usually fixed. In this dissertation, we try to bridge the gap between these two areas by studying the asymptotic geometric properties of sequences of sets. Our main contribution is a theory for studying the growth rates of intrinsic volumes for sequences of convex sets satisfying some natural growth constraints. As an illustration of the usefulness of our techniques, we consider two specific problems. The first problem is that of analyzing the Shannon capacities of power-constrained communication channels. In particular, we study a power-constrained channel arising out of the energy harvesting communication model, called the $(\sigma, \rho)$-power constrained additive white Gaussian noise (AWGN) channel. Our second problem deals with forging new connections between geometry and information theory by studying the intrinsic volumes of sequences of typical sets. For log-concave distributions, we show the existence of a new quantity called the intrinsic entropy, which can be interpreted as a generalization of differential entropy.
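
For orientation, the two technical objects named in the abstract have standard definitions; the equations below are stated as general background, and the dissertation's own notation and assumptions may differ slightly. The intrinsic volumes $V_0(K), \ldots, V_n(K)$ of a convex body $K \subseteq \mathbb{R}^n$ arise as the coefficients in Steiner's formula for the volume of the parallel body,

\[
  \operatorname{vol}_n\!\left(K + \epsilon B^n\right)
  \;=\; \sum_{j=0}^{n} \kappa_{n-j}\, V_j(K)\, \epsilon^{\,n-j},
  \qquad \epsilon \ge 0,
\]

where $B^n$ is the Euclidean unit ball and $\kappa_m$ is the volume of $B^m$; in particular, $V_n(K)$ is the volume of $K$, $2V_{n-1}(K)$ its surface area, and $V_0(K) = 1$. The $(\sigma, \rho)$-power constraint, as commonly stated in the energy harvesting literature, requires every length-$n$ codeword $(x_1, \ldots, x_n)$ to expend at most $\sigma + \rho\, m$ units of energy over any window of $m$ consecutive channel uses,

\[
  \sum_{i=j}^{k} x_i^2 \;\le\; \sigma + \rho\,(k - j + 1)
  \qquad \text{for all } 1 \le j \le k \le n,
\]

which models a transmitter that harvests energy at rate $\rho$ and can store at most $\sigma$ units of energy.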
