
UC Berkeley Electronic Theses and Dissertations

Abstraction, Generalization, and Embodiment in Neural Program Synthesis

Abstract

Program synthesis, or automatically writing programs from high-level specifications, has been a long-standing challenge in computer science and artificial intelligence. Addressing this challenge can help bring the full power of computing to nontechnical users, assist existing developers with traditional programming tasks, and solve other artificial intelligence tasks, such as question answering, that are naturally expressible as programs. In recent years, learning-based neural methods for program synthesis have driven significant progress. With this shift, themes that recur in other facets of machine learning, namely abstraction, generalization, and embodiment, provide a natural framework for further improvement. In this dissertation, we present methods that address manifestations of these themes in several concrete instantiations of neural program synthesis.

First, we demonstrate how to better synthesize imperative programs by interacting with the program interpreter environment in the form of predicted execution traces, in a challenging program synthesis domain called Karel. We also show in empirical studies that generating synthetic data for program synthesis requires significant care if models are to generalize. In an application of program synthesis from natural language, also known as semantic parsing, we present attention-based neural architectures that encode the natural language specification more effectively, enabling better generalization to new database domains. In this and other code generation domains, we introduce a method for integrating automatically learned code idioms into the synthesis procedure, learning to switch automatically between multiple levels of abstraction.
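To make the Karel setting and the notion of an execution trace concrete, the following is a minimal, hypothetical sketch (not the dissertation's actual system or DSL): a toy Karel-style interpreter in Python whose recorded sequence of states is the kind of intermediate execution signal a synthesizer could condition on. All names here (KarelState, step, execute) are illustrative assumptions.

```python
# Hypothetical illustration: a toy straight-line Karel-style interpreter
# that records an execution trace (the sequence of intermediate states).

from dataclasses import dataclass, field


@dataclass
class KarelState:
    """Robot position, heading, and placed markers on an unbounded grid."""
    x: int = 0
    y: int = 0
    heading: str = "east"  # one of: north, east, south, west
    markers: frozenset = field(default_factory=frozenset)


HEADINGS = ["north", "east", "south", "west"]
MOVES = {"north": (0, 1), "east": (1, 0), "south": (0, -1), "west": (-1, 0)}


def step(state: KarelState, instruction: str) -> KarelState:
    """Apply one primitive instruction and return the resulting state."""
    if instruction == "move":
        dx, dy = MOVES[state.heading]
        return KarelState(state.x + dx, state.y + dy, state.heading, state.markers)
    if instruction == "turnRight":
        idx = (HEADINGS.index(state.heading) + 1) % 4
        return KarelState(state.x, state.y, HEADINGS[idx], state.markers)
    if instruction == "putMarker":
        return KarelState(state.x, state.y, state.heading,
                          state.markers | {(state.x, state.y)})
    raise ValueError(f"unknown instruction: {instruction}")


def execute(program: list[str], start: KarelState) -> list[KarelState]:
    """Run a straight-line program and return its full execution trace."""
    trace = [start]
    for instruction in program:
        trace.append(step(trace[-1], instruction))
    return trace


if __name__ == "__main__":
    program = ["move", "putMarker", "turnRight", "move"]
    for state in execute(program, KarelState()):
        print(state)
```

In this sketch the trace is simply the list of states visited during execution; a neural synthesizer that predicts such traces alongside program tokens gets step-by-step feedback from the interpreter environment rather than only an end-state specification.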
