eScholarship
Open Access Publications from the University of California

UC Merced Electronic Theses and Dissertations

Data-Based Motion Planning for Full-Body Virtual Human Interaction with the Environment

Abstract

Autonomous virtual characters are important in a growing number of applications, ranging from simulation-based training to computer games. In this dissertation I propose a new data-based mobile manipulation framework for achieving real-time autonomous characters able to perform full-body interactions with the environment. The overall approach relies on a few example motions to guide the generation of complex movements that replicate human-like characteristics.

The proposed framework is based on three major components. The first is a fast locomotion module that generates controllable models from single motion examples and provides independent control of direction, orientation, and velocity, with known coverage and quality characteristics. The approach is computationally efficient and achieves high controllability of stepping behaviors, addressing key properties for supporting a variety of whole-body manipulation tasks.
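To make the idea of independent direction, orientation, and velocity control concrete, the following is a minimal illustrative sketch, not the dissertation's actual controller: it time-scales and rotates the root trajectory of a single example step cycle according to commanded parameters. All names and the data layout are assumptions made for illustration.

# Hypothetical sketch: warping one example step cycle to a commanded
# travel direction, facing orientation, and speed (illustrative only).
import numpy as np

class StepController:
    def __init__(self, example_root_positions, example_cycle_duration):
        self.example = np.asarray(example_root_positions, dtype=float)  # (T, 2) root path
        self.duration = float(example_cycle_duration)                   # seconds per cycle

    def step(self, t, direction_rad, facing_rad, speed):
        # Phase within the example cycle, time-scaled so the character
        # travels at the commanded speed instead of the example's speed.
        example_speed = np.linalg.norm(self.example[-1] - self.example[0]) / self.duration
        phase = (t * speed / example_speed / self.duration) % 1.0
        idx = int(phase * (len(self.example) - 1))

        # Rotate the example root displacement toward the commanded direction.
        local = self.example[idx] - self.example[0]
        c, s = np.cos(direction_rad), np.sin(direction_rad)
        root_xy = np.array([c * local[0] - s * local[1],
                            s * local[0] + c * local[1]])

        # Facing orientation is returned separately from travel direction,
        # which is what allows decoupled behaviors such as sidestepping.
        return root_xy, facing_rad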

The second component applies the proposed locomotion controller to multiple locomotion behaviors in order to plan multi-behavior paths around obstacles. A new locomotion planning approach is proposed in which the behavioral capabilities of the character are considered during the path planning stage, addressing the trade-off between path length and preferred navigation behavior when deciding which narrow passages to take. The approach relies on new types of operations on planar navigation meshes and reaches execution times fast enough for real-time applications.
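One way to picture behavior-aware path planning is a graph search over navigation mesh cells where each portal's clearance determines the cheapest behavior able to traverse it, and less preferred behaviors add a cost penalty. The sketch below is an illustration of that trade-off under assumed clearance thresholds and penalties; it is not the dissertation's algorithm or data structures.

# Illustrative sketch: behavior-aware search over a navigation mesh graph.
import heapq

# Behaviors ordered from most to least preferred, with assumed clearance
# requirements (meters) and cost penalties; values are purely illustrative.
BEHAVIOR_CLEARANCE = [("walk", 0.8), ("sidestep", 0.5), ("shimmy", 0.3)]
BEHAVIOR_PENALTY = {"walk": 0.0, "sidestep": 2.0, "shimmy": 5.0}

def cheapest_behavior(portal_width):
    for name, clearance in BEHAVIOR_CLEARANCE:
        if portal_width >= clearance:
            return name
    return None  # passage too narrow for any available behavior

def plan(adjacency, start, goal):
    """adjacency: node -> list of (neighbor, edge_length, portal_width)."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        for nxt, length, width in adjacency.get(node, []):
            behavior = cheapest_behavior(width)
            if behavior is None:
                continue  # no behavior fits through this portal
            new_cost = cost + length + BEHAVIOR_PENALTY[behavior]
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(frontier, (new_cost, nxt, path + [nxt]))
    return None, float("inf")

Depending on the penalty values, the planner will either accept a longer detour using the preferred walking behavior or take a shorter route through a narrow passage with a less preferred behavior, which is the trade-off described above.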

The last component focuses on the coordination between locomotion and upper-body manipulation. The proposed approach is based on learning spatial coordination features from example motions and on associating body-environment proximity information with the body configurations of each example motion. These body configurations then become the input to a regression system which, in turn, is able to generate new interactions for different situations in similar environments. The regression model encodes and replicates key spatial strategies with respect to body coordination and the management of environment constraints. The obtained results successfully synthesize complex full-body actions such as opening doors and drawing on a wide whiteboard.
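As a rough illustration of regressing body configurations from proximity features, the sketch below uses a simple k-nearest-neighbor blend of example poses; this stands in for whatever regression model the dissertation actually employs, and all names and feature layouts are assumptions.

# Hedged sketch: predict a body configuration from body-environment
# proximity features by blending the nearest example frames.
import numpy as np

class ProximityPoseRegressor:
    def __init__(self, k=3):
        self.k = k
        self.features = None   # (N, F) proximity features per example frame
        self.poses = None      # (N, D) corresponding body configurations

    def fit(self, proximity_features, body_configurations):
        self.features = np.asarray(proximity_features, dtype=float)
        self.poses = np.asarray(body_configurations, dtype=float)

    def predict(self, query_features):
        # Blend the k closest example poses, weighted by inverse distance,
        # so similar environment constraints reproduce similar coordination.
        q = np.asarray(query_features, dtype=float)
        d = np.linalg.norm(self.features - q, axis=1)
        nearest = np.argsort(d)[: self.k]
        w = 1.0 / (d[nearest] + 1e-6)
        w /= w.sum()
        return (w[:, None] * self.poses[nearest]).sum(axis=0)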

The models proposed in this dissertation result in new interactive controllers able to synthesize coordinated full-body motions for a variety of complex interactions requiring both body mobility and manipulation.
