Dataset Open Access

NEON Tree Crowns Dataset

Ben Weinstein; Sergio Marconi; Alina Zare; Stephanie Bohlman; Sarah Graves; Aditya Singh; Ethan White


The NeonTreeCrowns dataset is a set of individual-level crown estimates for 100 million trees at 37 geographic sites across the United States, surveyed by the National Ecological Observatory Network's Airborne Observation Platform. Each rectangular bounding-box crown prediction includes height, crown area, and spatial location.

How can I see the data?

A web server for browsing the predictions is available through

Dataset Organization

The dataset contains 11,000 shapefiles, each corresponding to a 1 km² RGB tile from NEON (ID: DP3.30010.001). For example, "2019_SOAP_4_302000_4100000_image.shp" contains the predictions for "2019_SOAP_4_302000_4100000_image.tif", available from the NEON data portal. NEON's file-naming convention encodes the year of data collection (2019), the four-letter site code (SOAP), the sampling event (4), and the UTM coordinates of the tile's top-left corner (302000_4100000). For NEON site abbreviations and UTM zones see
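As a sketch of working with this naming convention, the components of a tile name can be recovered with a small parser. The helper below is hypothetical (not part of the dataset release) and assumes the pattern described above:

```python
import re

# Hypothetical helper: parse NEON's tile-naming convention, e.g.
# "2019_SOAP_4_302000_4100000_image" -> year, site, event, UTM corner.
NEON_TILE = re.compile(
    r"(?P<year>\d{4})_(?P<site>[A-Z]{4})_(?P<event>\d+)_"
    r"(?P<easting>\d+)_(?P<northing>\d+)_image"
)

def parse_tile_name(name):
    """Split a tile name into its named components."""
    m = NEON_TILE.match(name)
    if m is None:
        raise ValueError(f"Not a NEON tile name: {name!r}")
    d = m.groupdict()
    return {
        "year": int(d["year"]),
        "site": d["site"],
        "event": int(d["event"]),
        "easting": int(d["easting"]),    # UTM x of top-left corner (m)
        "northing": int(d["northing"]),  # UTM y of top-left corner (m)
    }

print(parse_tile_name("2019_SOAP_4_302000_4100000_image"))
```

The same parser works for the `.shp` and `.tif` names after stripping the extension.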

The predictions are also available as a single CSV per site: all available tiles for that site and year are combined into one large file. These data are not projected, but contain the UTM coordinates of each bounding box (left, bottom, right, top). For both file types the following fields are available:

Height: The crown height measured in meters. Crown height is defined as the 99th percentile of all canopy height pixels from a LiDAR-derived canopy height model (ID: DP3.30015.001).

Area: The crown area, in m², of the rectangular bounding box.

Label: All data in this release are "Tree".

Score: The confidence score from the DeepForest deep learning algorithm. Scores range from 0 (low confidence) to 1 (high confidence).
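A minimal sketch of working with a per-site CSV, using the column names from the field list above. The tiny inline table is invented for illustration, not real data; real files hold millions of crowns per site:

```python
import pandas as pd

# Illustrative rows only; column names follow the dataset's field list.
crowns = pd.DataFrame({
    "left":   [302000.0, 302010.0],
    "bottom": [4099990.0, 4099975.0],
    "right":  [302004.0, 302013.0],
    "top":    [4099994.0, 4099981.0],
    "Height": [12.4, 5.1],
    "Label":  ["Tree", "Tree"],
    "Score":  [0.92, 0.31],
})

# Keep only confident detections (0.5 is an arbitrary example threshold).
confident = crowns[crowns["Score"] >= 0.5].copy()

# Bounding-box area in m^2 from the UTM extents (coordinates are in meters).
confident["Area"] = (
    (confident["right"] - confident["left"])
    * (confident["top"] - confident["bottom"])
)

print(confident[["Height", "Score", "Area"]])
```

Because the CSVs are unprojected, spatial work (e.g. clipping to a plot boundary) is easier with the shapefiles, which carry the site's UTM projection.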

How were predictions made?

The DeepForest algorithm is available as a Python package: Predictions were overlaid on the LiDAR-derived canopy height model, and predictions with heights less than 3 m were removed.

How were predictions validated?

Please see the following publications:

Weinstein, B. G., Marconi, S., Bohlman, S. A., Zare, A., & White, E. P. (2020). Cross-site learning in deep learning RGB tree crown detection. Ecological Informatics, 56, 101061.

Weinstein, B., Marconi, S., Aubry-Kientz, M., Vincent, G., Senyondo, H., & White, E. (2020). DeepForest: A Python package for RGB deep learning tree crown delineation. bioRxiv.

Weinstein, B. G., Marconi, S., Bohlman, S. A., Zare, A., & White, E. P. (2019). Individual tree-crown detection in RGB imagery using semi-supervised deep learning neural networks. Remote Sensing, 11(11), 1309.

Were any sites removed?

Several sites were removed due to poor NEON data quality. GRSM and PUUM both had lower-quality RGB data that made them unsuitable for prediction. NEON surveys are updated annually, and we expect future flights to correct these errors. We removed the GUIL (Puerto Rico) site due to its very steep topography and poor sun angle during data collection; the DeepForest algorithm performed poorly when predicting crowns in intensely shaded areas with very little sun penetration. We are happy to make these data available upon request.

Contact

We welcome questions, ideas, and general inquiries. The data can be used for many applications, and we look forward to hearing from you. Contact

Funding: Gordon and Betty Moore Foundation: GBMF4563

Keywords: Deep Learning; Remote Sensing
                   All versions    This version
Views              3,974           3,969
Downloads          4,885           4,889
Data volume        2.8 TB          2.8 TB
Unique views       3,348           3,344
Unique downloads   2,887           2,890

