eScholarship
Open Access Publications from the University of California


UC San Diego Electronic Theses and Dissertations

Empowering Conservation through Deep Convolutional Neural Networks and Unmanned Aerial Systems

Abstract

Tropical rainforests worldwide are negatively impacted by a variety of human-caused threats. Unfortunately, our ability to study these rainforests is impeded by logistical problems such as their physical inaccessibility, expensive aerial imagery, and/or coarse satellite data. One solution is the use of low-cost Unmanned Aerial Vehicles (UAVs), commonly referred to as drones. Drones are now widely recognized as a tool for ecology, environmental science, and conservation, collecting imagery with far finer resolution than satellite data. We asked: Can we take advantage of this sub-meter, high-resolution imagery to detect specific tree species or groups, and use these data as indicators of rainforest functional traits and characteristics? We demonstrate a low-cost method for obtaining high-resolution aerial imagery in a rainforest of Belize using a drone over three sites in two rainforest protected areas. We built a workflow that uses Structure from Motion (SfM) on the drone images to create a large orthomosaic and a Deep Convolutional Neural Network (CNN) to classify indicator tree species. We selected: 1) the Cohune Palm (Attalea cohune), which is indicative of past disturbance and current soil condition; and 2) the dry-season deciduous tree group, since deciduousness is an important ecological factor of rainforest structure and function. This framework serves as a guide for tackling difficult ecological challenges, and we show two additional examples of how a similar architecture can help count wildlife populations in the Arctic.
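The classification step of the workflow described above can be illustrated with a minimal sketch: cut the SfM orthomosaic into fixed-size tiles and run each tile through a CNN classifier. This is not the dissertation's code; the file name, tile size, backbone (a fine-tuned ResNet-18), and class labels are all illustrative assumptions.

```python
# Minimal sketch: tile an SfM orthomosaic and classify each tile with a CNN.
# The backbone, tile size, class names, and file path are assumptions, not
# details taken from the dissertation.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

CLASSES = ["cohune_palm", "deciduous", "other"]   # assumed indicator classes
TILE = 224                                        # tile edge length in pixels

# ImageNet-pretrained backbone with a new classification head; fine-tuning on
# labeled orthomosaic tiles is assumed to have happened elsewhere.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_orthomosaic(path: str):
    """Slide a non-overlapping TILE x TILE window over the orthomosaic and
    return (left, top, predicted_class) for every tile."""
    mosaic = Image.open(path).convert("RGB")
    width, height = mosaic.size
    results = []
    with torch.no_grad():
        for top in range(0, height - TILE + 1, TILE):
            for left in range(0, width - TILE + 1, TILE):
                tile = mosaic.crop((left, top, left + TILE, top + TILE))
                logits = model(to_tensor(tile).unsqueeze(0))
                results.append((left, top, CLASSES[logits.argmax(1).item()]))
    return results

if __name__ == "__main__":
    # "orthomosaic.png" is a placeholder path for an exported orthomosaic.
    for left, top, label in classify_orthomosaic("orthomosaic.png")[:10]:
        print(f"tile at ({left}, {top}): {label}")
```

Per-tile predictions like these can then be aggregated spatially, for example to map the density of Cohune Palm or deciduous canopy across each protected area.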
