Seeing the Forest and the Trees

Unmanned Aerial Systems for Forest Management

By Karina Puikkonen
Published July 6, 2021
Drone footage courtesy of Neal Swayze and Matthew Creasy

Airborne Light Detection and Ranging (LiDAR) has been used from aircraft to map large forested areas for more than 30 years, but high costs and the lack of color imagery have made it a less efficient resource for describing tree-level variation at smaller scales.

Since 2010, unmanned aerial systems (UAS) have emerged as an alternative technology that can provide cost-effective, high-resolution images for mapping local forest areas. While unmanned aerial vehicles, or drones, have been flown over forests for several years, basic questions about their efficacy in forest management have gone unanswered until now.

A team led by Colorado State University researchers spent the past three years testing UAS in Western ponderosa pine forests. A series of three new research articles recently published in Forests, Canadian Journal of Forest Research, and Remote Sensing of Environment tackle foundational aspects of using UAS as a spatial forest monitoring and management tool. Wade Tinkham, the project lead, said the team's collective results offer consistent parameters to improve forest management decision-making.

“Managers need a spatially explicit way to monitor individual trees,” explained Tinkham, an assistant professor in the Department of Forest and Rangeland Stewardship in the Warner College of Natural Resources. “Providing higher-quality data about these ecological spaces is starting to matter more and more in management decisions.”

Forest technicians typically collect tree measurements on the ground with hand tools and instruments designed for these specific purposes. The CSU research team found that individual tree metrics can be reliably determined over large forest stands with UAS, greatly expanding what can be monitored in a forest.

Photo credits: Karina Puikkonen

“Foresters walk through a forest deciding what trees to remove or keep, but they can only see 20 to 30 trees at a time,” Tinkham said. “With UAS, we found that we can see nearly every tree to inform these decisions.”

These decisions ultimately affect management actions designed to benefit various forest ecosystem services such as promoting wildlife habitat, watershed protection, and natural fire regimes.

Processing parameters for UAS data extraction

The research team first determined the software parameters needed to optimize their final product: a realistic digital representation of the three-dimensional forest from which individual tree measurements could be assessed. These results are summarized in Forests.

“There are a thousand ways to do this, and we needed to know what data capture and processing methods best represent the forest,” Tinkham said.

The team developed a basic workflow for using UAS in forests. The workflow consisted of four components: data collection, digital tree extraction, data filtering and modeling, and analysis and validation of model output. This translates to capturing thousands of images with UAVs and transforming them into data points collectively reconstructed into forest renderings using Structure from Motion (SfM) photogrammetry.
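The published workflow is tool-agnostic, but the filtering-and-modeling step can be illustrated with a short sketch. The Python example below is a simplified stand-in rather than the team's actual pipeline: it assumes a point cloud has already been reconstructed by SfM software and exported as x, y, z coordinates, normalizes each point's height against a ground elevation, and grids the points into a canopy height model from which individual trees could later be extracted.

import numpy as np

def canopy_height_model(points, ground_z, cell=0.5):
    """Grid a height-normalized point cloud into a canopy height model (CHM).

    points   : (N, 3) array of x, y, z coordinates from the SfM reconstruction
    ground_z : per-point ground elevation (e.g., interpolated from a terrain model)
    cell     : CHM cell size in meters (illustrative value, not from the study)
    """
    heights = points[:, 2] - ground_z              # normalize: height above ground
    x, y = points[:, 0], points[:, 1]

    cols = ((x - x.min()) / cell).astype(int)      # map each point to a grid cell
    rows = ((y - y.min()) / cell).astype(int)

    chm = np.zeros((rows.max() + 1, cols.max() + 1))
    np.maximum.at(chm, (rows, cols), heights)      # keep the tallest point per cell
    return chm

# Tiny synthetic example: a few points clustered around two "trees"
pts = np.array([[1.0, 1.0, 8.2], [1.2, 0.9, 7.9], [4.0, 4.1, 12.5], [4.2, 4.0, 12.1]])
print(canopy_height_model(pts, ground_z=np.zeros(len(pts)), cell=1.0))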

Neal Swayze, one of the study’s co-authors, said SfM uses the camera’s properties and different viewpoints to solve for the three-dimensional location of pixels, which in this case make up individual trees.

“Structure from Motion is like hundreds of eyes looking at something from above that create a multi-angled perspective of that object,” said Swayze, who recently earned his master’s degree from the Warner College. “Computer learning detects points in all those images and stitches the matching points together into a rough approximation of the picture. Structure from Motion is what our eyes and brain do all the time.”

Digital forest rendering courtesy of Neal Swayze
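Swayze's description can be made concrete with a small numerical sketch. The snippet below is an illustrative toy example, not the software the team used: given two known camera projection matrices and the pixel coordinates where the same point appears in both images, it solves for that point's 3D location by linear triangulation, the core geometric operation that SfM repeats for millions of matched points. The camera intrinsics and the test point are assumed values.

import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point seen in two images.

    P1, P2   : 3x4 camera projection matrices (intrinsics * [R | t])
    uv1, uv2 : (u, v) pixel coordinates of the matched point in each image
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)        # least-squares solution: last right singular vector
    X = Vt[-1]
    return X[:3] / X[3]                # convert from homogeneous coordinates

# Toy setup: two cameras one meter apart, both looking down the z axis
K = np.array([[1000.0, 0, 640], [0, 1000.0, 480], [0, 0, 1]])     # assumed intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])

X_true = np.array([2.0, 1.0, 10.0])                                # a point 10 m away
uv1 = (P1 @ np.append(X_true, 1))[:2] / (P1 @ np.append(X_true, 1))[2]
uv2 = (P2 @ np.append(X_true, 1))[:2] / (P2 @ np.append(X_true, 1))[2]
print(triangulate(P1, P2, uv1, uv2))                               # recovers ~[2, 1, 10]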

The team conducted UAS test flights and ran the collection of images through various software settings to compare the completeness of each forest rendering outcome.

“We found that the highest possible settings produced the best results,” Swayze said.

When considering the efficiency of UAS data processing, Swayze said that processing 1.5 acres of tree data at the highest settings took about 1.5 hours. But the team also found that the second-best settings still provided over 90% accuracy while cutting processing time by more than half.
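As a back-of-the-envelope illustration of that trade-off, the short calculation below extrapolates the reported per-area processing rate to a larger stand; the 500-acre stand size is an arbitrary example, not a figure from the study.

# Reported rate: ~1.5 hours to process ~1.5 acres at the highest settings,
# with the second-best settings cutting processing time by more than half
# while keeping over 90% accuracy.
hours_per_acre_highest = 1.5 / 1.5                       # = 1.0 hour per acre
hours_per_acre_second = hours_per_acre_highest * 0.5     # "more than half" -> at most 0.5

stand_acres = 500                                        # hypothetical stand size
print(f"Highest settings:  ~{stand_acres * hours_per_acre_highest:.0f} hours")
print(f"Second-best:       <{stand_acres * hours_per_acre_second:.0f} hours")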

UAS accuracy for tree detection

The Canadian Journal of Forest Research study evaluated whether airborne LiDAR individual tree detection methods that have provided useful approximations in conifer-dominated systems could be accurately applied to the SfM point cloud renderings. This required understanding how horizontal and vertical forest structure would impact the quality of tree detection.

Determining horizontal forest structure involves looking down on a forest from above, assessing the number of trees and the distances between them. Vertical structure involves looking through a grouping of large and small trees to determine what habitat wildlife may use in an area or to reveal potential wildfire paths. Together, these viewpoints ensure both the overstory of mature trees and the understory of juvenile trees are accounted for.

“This collective spatial variability shows how individual trees and groups of trees promote ecological processes in the forest,” Tinkham said.

Matthew Creasy, the study’s lead author, collected forest images with UAS and processed them through the workflow the team defined earlier. Creasy and Swayze also mapped nearly 5,000 trees in the Kaibab National Forest in northern Arizona and in the Manitou Experimental Forest in central Colorado by hand. This helped the team evaluate several algorithms to determine the actual accuracy of detecting a tree’s height and location.

The overall accuracy for UAS tree detection was 78% at Kaibab and 68% at Manitou. But when looking at the overstory only, tree detection exceeded expectations, with more than 93% of trees identified in these forests.

While the UAS data provided reliable locations and heights for mature and intermediate-sized trees, it underpredicted individuals in the understory, where small trees growing in close proximity were often identified as a single tree. Tinkham said these results still describe the relative variation in forest structure across size classes, providing valuable information for developing forest management treatments.
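One widely used family of individual tree detection methods, and a plausible stand-in for the kinds of algorithms evaluated here, finds local maxima in the canopy height model and treats each peak as a treetop. The sketch below is illustrative only; the window size and height threshold are assumed values, not the study's parameters.

import numpy as np
from scipy.ndimage import maximum_filter

def detect_treetops(chm, window=5, min_height=2.0):
    """Return (row, col, height) for CHM cells that are local maxima.

    window     : moving-window size in cells (assumed value, not from the study)
    min_height : ignore peaks below this height in meters, to skip shrubs and noise
    """
    local_max = maximum_filter(chm, size=window) == chm   # cell equals its neighborhood max
    peaks = local_max & (chm >= min_height)
    rows, cols = np.nonzero(peaks)
    return [(r, c, chm[r, c]) for r, c in zip(rows, cols)]

# Synthetic 20 x 20 canopy height model with two "treetops"
chm = np.zeros((20, 20))
chm[5, 5], chm[14, 12] = 12.5, 9.3
print(detect_treetops(chm))   # -> [(5, 5, 12.5), (14, 12, 9.3)]

This approach also hints at why closely spaced understory trees are hard to separate: when two small crowns fall inside the same search window, only the taller one survives as a local maximum.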

Videography by Karina Puikkonen


Maximizing UAS data capture

Once the team determined the software program's capability and accuracy, their study in Remote Sensing of Environment tested different UAS flight approaches to see how these affected the ability to derive digital tree measurements. Swayze flew 30 UAS test flights at the Manitou Experimental Forest using various combinations of three altitudes, two flight patterns, and five camera orientations.

“We found that UAS acquisitions can be optimized for data sets to provide nearly complete tree lists for individual heights, locations, and DBH (diameters at breast height),” Swayze said. “We can now say that flying a certain height, pattern, and overlap will get the best data in dry conifer ecosystems.”

The team determined that using angled, overlapping north-south and east-west flight lines at lower altitudes maximized the accuracy and correlation of overall forest measurements calculated from the individual tree metrics. Swayze said these crosshatch patterns double the amount of data collected, which needs to be factored into the data acquisition and processing time.
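The geometry behind those flight choices can be sketched with standard photogrammetric formulas. The snippet below computes ground sampling distance and the flight-line spacing implied by a chosen side overlap; the camera specifications, the 100 m altitude, and the 85% overlap are assumed for illustration and are not the parameters reported in the study.

def ground_sampling_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance (meters per pixel) for a nadir photo."""
    return (altitude_m * sensor_width_mm) / (focal_mm * image_width_px)

def line_spacing(altitude_m, focal_mm, sensor_width_mm, side_overlap):
    """Distance between adjacent flight lines for a given side-overlap fraction."""
    footprint_width = altitude_m * sensor_width_mm / focal_mm   # ground width of one image
    return footprint_width * (1.0 - side_overlap)

# Assumed example camera: 8.8 mm focal length, 13.2 mm sensor width, 5472 px image width
alt = 100.0                                                     # illustrative altitude (m)
gsd = ground_sampling_distance(alt, 8.8, 13.2, 5472)
spacing = line_spacing(alt, 8.8, 13.2, side_overlap=0.85)

print(f"GSD: {gsd * 100:.1f} cm/px, line spacing: {spacing:.1f} m")
# Flying the same area again with perpendicular (east-west) lines - the crosshatch
# pattern described above - roughly doubles the number of photos to process.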

Videography by Karina Puikkonen

The team again compared their digital measurements with ground measurements taken for nearly 1,000 trees at the site. The results showed that the cross-sectional area captured from UAS was within 5-10% of the true measurements. This suggested that UAS flights could reliably collect comparable data with limited need for field observations.
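Cross-sectional stem area, or basal area, is computed from diameter at breast height, so the UAS-to-ground comparison reduces to a simple aggregate check. The snippet below shows that calculation with made-up DBH values; the numbers are illustrative, not the study's data.

import math

def basal_area_m2(dbh_cm):
    """Cross-sectional area of a stem (m^2) from diameter at breast height (cm)."""
    return math.pi * (dbh_cm / 200.0) ** 2     # /200 converts cm diameter to m radius

# Hypothetical DBH lists for the same trees measured on the ground and from UAS data
ground_dbh = [32.0, 41.5, 27.8, 36.2]
uas_dbh = [30.9, 43.0, 26.5, 37.8]

ground_total = sum(basal_area_m2(d) for d in ground_dbh)
uas_total = sum(basal_area_m2(d) for d in uas_dbh)

pct_diff = 100.0 * abs(uas_total - ground_total) / ground_total
print(f"Stand basal area differs by {pct_diff:.1f}%")   # the study found UAS within 5-10%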

“Although not all trees were successfully extracted from the UAS data, these methods capture the relative local trends and stand-level averages that are necessary for informing a broad range of thinning and restoration actions in dry conifer forests,” Tinkham said.

By experimenting with different ways to assemble and use the wealth of information UAS technology provides, this research team has offered a UAS-based road map for forest managers. Monitoring ponderosa pine forests from both the ground and the air expands our understanding of how these ecosystems have evolved, and how forest structure and function may adapt to a hotter and drier climate.