UC San Diego Electronic Theses and Dissertations

Boltzmann Energetics and Temporal Dynamics of Learning Neuromorphic Systems

Abstract

The brain's cognitive power does not arise from exacting digital precision in high-performance computing, but emerges from an extremely efficient and resilient collective form of computation extending over very large ensembles of sluggish, imprecise, and unreliable analog components. In contrast to the reliable spike-generation mechanism of cortical neurons, synapses are regarded as the primary source of this probabilistic behavior, owing to release failures and quantal fluctuations. It has been speculated that the overall power efficiency and noise tolerance of the brain are a result of this unreliability in communication between neurons. Inspired by the stochastic nature of brain dynamics, we present methods of exploiting these concepts to produce more efficient algorithms and systems in the realm of neuromorphic computing, offering links between two traditionally disjoint scientific disciplines: computational neuroscience, concerned with constructing models of brain function, and machine learning, concerned with realizing adaptive computational intelligence.

The first part of the dissertation investigates extensions of the Boltzmann machine, a stochastic recurrent artificial neural network capable of learning probability distributions over its inputs. Boltzmann machines are interesting from a neuromorphic perspective due to the local, Hebbian nature of their learning rule and the parallel processing across network layers, a mode of operation shared by biological neural networks. Additionally, the neurons in these networks have probabilistic activation functions and communicate with binary events, similar to what has been observed in experimental recordings of neural data. In search of greater biological plausibility in inference and learning, we present conditions under which Boltzmann machines trained with contrastive divergence are equivalent to networks of integrate-and-fire neurons with spike-timing-dependent plasticity (STDP). Next, we extend our methods to networks whose sole source of stochasticity is the synapse, showing that synaptic noise can provide an efficient means of sampling. We then investigate how STDP, the hallmark learning rule of spiking neural networks, can be performed using only forward connectivity access, and compare different data structures for organizing synaptic weights for memory efficiency.
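To make the flavor of this learning rule concrete, below is a minimal sketch of one contrastive-divergence (CD-1) update for a restricted Boltzmann machine with stochastic binary units. It illustrates the standard algorithm rather than the dissertation's own code; the layer sizes, learning rate, and NumPy formulation are assumptions made for the example.

```python
# Minimal CD-1 sketch (illustrative, not the dissertation's code).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    # Stochastic binary activation: each unit fires with probability p,
    # mirroring the probabilistic, event-based units described above.
    return (rng.random(p.shape) < p).astype(float)

def cd1_step(v0, W, b_vis, b_hid, lr=0.01):
    # Positive phase: sample hidden units given the data vector v0.
    ph0 = sigmoid(v0 @ W + b_hid)
    h0 = sample(ph0)
    # Negative phase: one Gibbs step back to a reconstruction.
    pv1 = sigmoid(h0 @ W.T + b_vis)
    v1 = sample(pv1)
    ph1 = sigmoid(v1 @ W + b_hid)
    # Local, Hebbian-style update: difference of pre/post correlations
    # in the data-driven and reconstruction-driven phases.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b_vis += lr * (v0 - v1)
    b_hid += lr * (ph0 - ph1)

# Toy usage: 6 visible units, 4 hidden units, one random binary pattern.
W = rng.normal(0, 0.1, size=(6, 4))
b_vis, b_hid = np.zeros(6), np.zeros(4)
cd1_step(sample(np.full(6, 0.5)), W, b_vis, b_hid)
```

Note that each weight update depends only on the activity of the two units at the ends of that connection, which is the locality property that makes the rule attractive for neuromorphic hardware.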

In the second part of the dissertation, we focus on learning neuromorphic systems and applications, including a methodology and an automation tool for implementing generative models of Boltzmann machines with digital spiking neurons. Next, we demonstrate how sparsely active neurons enable efficient computation in a small-footprint keyword-spotting application. Lastly, we present our ongoing work in designing a very large-scale reconfigurable digital neuromorphic system tailored to both the machine learning and the computational neuroscience communities. The system exploits the stochastic and temporal coding strategies developed in the first part of the dissertation and serves as an openly shared platform for further community-driven research in low-precision computation and event-driven processing.
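As an illustration of event-driven plasticity with forward connectivity access only, the sketch below implements one common deferred scheme for pair-based STDP: synapses are visited solely when their presynaptic neuron fires, and potentiation owed to earlier postsynaptic spikes is applied retroactively from a short post-spike history kept at the neuron. This is a generic scheme under assumed constants and data layouts, not necessarily the mechanism developed in the dissertation.

```python
# Forward-access, event-driven STDP sketch (assumed details).
import math

# Illustrative constants, not taken from the dissertation.
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # trace time constants (ms)
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes

def on_pre_spike(t_now, pre, post, syn):
    """Update one forward synapse when its presynaptic neuron fires.

    pre:  {'trace': presynaptic trace value, 'trace_t': time it was stored}
    post: {'spikes': recent postsynaptic spike times, ascending}
    syn:  {'w': weight, 't_last': time of this synapse's last update}
    """
    # Deferred potentiation: replay post spikes since the last update.
    # Each one reads the presynaptic trace, decayed to that moment.
    for t_post in post['spikes']:
        if syn['t_last'] < t_post <= t_now:
            x = pre['trace'] * math.exp(-(t_post - pre['trace_t']) / TAU_PLUS)
            syn['w'] += A_PLUS * x
    # Depression paired with the current presynaptic spike, driven by
    # the most recent postsynaptic spike.
    if post['spikes']:
        syn['w'] -= A_MINUS * math.exp(-(t_now - post['spikes'][-1]) / TAU_MINUS)
    syn['t_last'] = t_now

# Toy usage: pre fires at t=0 and t=30 ms; post fires at t=10 ms.
pre = {'trace': 1.0, 'trace_t': 0.0}   # trace set by the t=0 pre spike
post = {'spikes': [10.0]}
syn = {'w': 0.5, 't_last': 0.0}
on_pre_spike(30.0, pre, post, syn)     # handle the t=30 pre spike
print(round(syn['w'], 4))
```

Because every update is triggered from the presynaptic side, the weights can be stored purely in forward order (e.g., per-neuron adjacency lists), which connects to the memory-layout trade-offs among synaptic data structures compared in the first part of the dissertation.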
