Dataset Open Access

A Convolutional Neural Network classifier identifies tree species in mixed-conifer forest from hyperspectral imagery

Geoffrey A Fricker; Jonathan Daniel Ventura; Jeffrey Wolf; Malcolm P. North; Frank W. Davis; Janet Franklin


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nmm##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">shapefile</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">tree species</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">convolutional neural network</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">hyperspectral imagery</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">NEON</subfield>
  </datafield>
  <controlfield tag="005">20200124192516.0</controlfield>
  <datafield tag="500" ind1=" " ind2=" ">
    <subfield code="a">Data to replicate the experiment are available for download in two zipped files:
"NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.zip" (imagery)
"CNN_LABELS_2019.zip" (training label shapefiles)

* Note: the imagery is 5.5 GB (zipped).

All code used to run the analysis is located in this repository:
https://github.com/jonathanventura/canopy

The only flightline you will need to replicate our results is "NEON_D17_TEAK_DP1_20170627_181333".
If you download your own NEON data, the raw HDF5 files can be converted to GeoTIFF using the R code found here:
http://neonscience.github.io/neon-data-institute-2016//R/open-NEON-hdf5-functions/

Contact the National Ecological Observatory Network (NEON) to download comparable imagery data files for all sites and collections: https://data.neonscience.org/home.</subfield>
  </datafield>
  <controlfield tag="001">3470250</controlfield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Department of Computer Sciences and Software Engineering, California Polytechnic State University, San Luis Obispo</subfield>
    <subfield code="a">Jonathan Daniel Ventura</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Amazon Corporation</subfield>
    <subfield code="a">Jeffrey Wolf</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">US Forest Service, PSW Research Station</subfield>
    <subfield code="a">Malcolm P. North</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Bren School of Environmental Science and Management, University of California, Santa Barbara</subfield>
    <subfield code="a">Frank W. Davis</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Department of Botany and Plant Sciences, University of California, Riverside</subfield>
    <subfield code="a">Janet Franklin</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">302332</subfield>
    <subfield code="z">md5:cd47bc6b75f674a9fde56536cf4f3996</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/CNN_LABELS_2019.zip</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">90</subfield>
    <subfield code="z">md5:1c5c4143ea190be96ccd66b370081ae1</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/D17_CHM_all.tfw</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">959027084</subfield>
    <subfield code="z">md5:7cd563036f50261f4933fddb1397ce65</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/D17_CHM_all.tif</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">2465</subfield>
    <subfield code="z">md5:f6f84f1a7b1142cc8608a5b8ba1efad6</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/D17_CHM_all.tif.aux.xml</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">325402188</subfield>
    <subfield code="z">md5:7093ed985e579b6c1c488ed4d4355462</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/D17_CHM_all.tif.ovr</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">299410936</subfield>
    <subfield code="z">md5:d83553422da132670e91d94633ffed3a</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/D17_CHM_all.zip</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">5</subfield>
    <subfield code="z">md5:ae3b3df9970b49b6523e608759bc957d</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.CPG</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">11506</subfield>
    <subfield code="z">md5:154608e331495a1d4b78dbbf6178a336</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.dbf</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">403</subfield>
    <subfield code="z">md5:91faad560efe79cfac796ce48b289b45</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.prj</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">7204</subfield>
    <subfield code="z">md5:7396a055c52e0498819501a4eac3ee15</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.sbn</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">532</subfield>
    <subfield code="z">md5:0a30d4dff47b380912b05adb618af470</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.sbx</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">462124</subfield>
    <subfield code="z">md5:956e78cc3e8e46715cd8b6813627c1ca</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.shp</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">17810</subfield>
    <subfield code="z">md5:fe8a57215bd3669ae46b2947eb424253</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.shp.xml</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">5804</subfield>
    <subfield code="z">md5:a9fb286779acee2b88816e29fccefd1d</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/Labels_Trimmed_Selective.shx</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">9589226406</subfield>
    <subfield code="z">md5:63ee0d65be0e9acaf7a9d3d4b077231e</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1_20170627_181333_reflectance.tif</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">7442</subfield>
    <subfield code="z">md5:406e22158f789ae3670f004d71a83fe1</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1_20170627_181333_reflectance.tif.aux.xml</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">639</subfield>
    <subfield code="z">md5:dc0d07b092c3caedb9251657499d1227</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1_20170627_181333_reflectance.tif.enp</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">1631284288</subfield>
    <subfield code="z">md5:1a198f7cbc697d13f1e2e8fe320a436e</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1_20170627_181333_reflectance.tif.ovr</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">5909087783</subfield>
    <subfield code="z">md5:3a189d128d457e7ddd8bb28b0d131140</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1QA_20170627_181333_reflectance.zip</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">135582954</subfield>
    <subfield code="z">md5:dd08b5ce3491536cf865e9383d766dcd</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.tif</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">8063</subfield>
    <subfield code="z">md5:c269a39838ce83441a09af51e3cf92bc</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.tif.aux.xml</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">13697238</subfield>
    <subfield code="z">md5:a0a8aa1b7dad91a5fb60f11ed6d72f0c</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.tif.ovr</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">50207761</subfield>
    <subfield code="z">md5:1b1bf2aa3e618d647491f3c77d0884ae</subfield>
    <subfield code="u">https://zenodo.org/record/3470250/files/NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.zip</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2019-09-27</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire_data</subfield>
    <subfield code="o">oai:zenodo.org:3470250</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="4">
    <subfield code="v">11</subfield>
    <subfield code="p">Remote Sensing</subfield>
    <subfield code="n">19</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">Social Sciences Department, California Polytechnic State University, San Luis Obispo</subfield>
    <subfield code="a">Geoffrey A Fricker</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">A Convolutional Neural Network classifier identifies tree species in mixed-conifer forest from hyperspectral imagery</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;Published online:&amp;nbsp;&lt;a href="https://www.mdpi.com/2072-4292/11/19/2326"&gt;https://www.mdpi.com/2072-4292/11/19/2326&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;DOI: 10.3390/rs11192326&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Abstract:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this study, we automate tree species classification and mapping using field-based training data, high-spatial-resolution airborne hyperspectral imagery, and a convolutional neural network (CNN) classifier. We tested our methods by identifying seven dominant tree species, as well as standing dead trees, in a mixed-conifer forest in the Southern Sierra Nevada Mountains, CA (USA), using training, validation, and testing datasets composed of spatially explicit transects and plots sampled across a single strip of imaging spectroscopy. We also used a three-band &amp;lsquo;Red-Green-Blue&amp;rsquo; pseudo true-color subset of the hyperspectral imagery strip to test the classification accuracy of a CNN model without the additional non-visible spectral data provided in the hyperspectral imagery. Our classifier is pixel-based rather than object-based, although we use three-dimensional structural information from airborne Light Detection and Ranging (LiDAR) to identify trees (points &amp;gt; 5 m above the ground), and the classifier was applied only to image pixels thus identified as tree crowns. By training a CNN classifier using field data and hyperspectral imagery, we were able to accurately identify tree species and predict their distribution, as well as the distribution of tree mortality, across the landscape. Using a window size of 15 pixels and eight hidden convolutional layers, a CNN model classified the correct species of 713 individual trees from hyperspectral imagery with an average F-score of 0.87 and F-scores ranging from 0.67&amp;ndash;0.95 depending on species. The CNN classification model performance increased from a combined F-score of 0.64 for the Red-Green-Blue model to a combined F-score of 0.87 for the hyperspectral model. The hyperspectral CNN model captures the species composition changes across ~700 meters (1935 to 2630 m) of elevation, from a lower-elevation mixed oak&amp;ndash;conifer forest to a higher-elevation fir-dominated coniferous forest. High-resolution tree species maps can support forest ecosystem monitoring and management, and identifying dead trees aids landscape assessment of forest mortality resulting from drought, insects, and pathogens. We publicly provide our code to apply deep learning classifiers to tree species identification from geospatial imagery and field training data.&lt;/p&gt;

&lt;p&gt;Digital Publication of the training data polygons and hyperspectral imagery used in&amp;nbsp;the manuscript &amp;quot;A Convolutional Neural Network classifier identifies tree species in mixed-conifer forest from hyperspectral imagery&amp;quot;.&lt;/p&gt;

&lt;p&gt;Code is available in a Jupyter Notebook and can be found here:&amp;nbsp;&lt;a href="https://github.com/jonathanventura/canopy"&gt;https://github.com/jonathanventura/canopy&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;National Ecological Observatory Network. 2018. Provisional data downloaded from&amp;nbsp;&lt;a href="http://data.neonscience.org/"&gt;http://data.neonscience.org&lt;/a&gt;&amp;nbsp;on 22&amp;nbsp;June 2018. Battelle, Boulder, CO, USA&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="n">doi</subfield>
    <subfield code="i">isVersionOf</subfield>
    <subfield code="a">10.5281/zenodo.3463588</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.3470250</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">dataset</subfield>
  </datafield>
</record>
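Each 856 datafield in the record above pairs a download URL with an md5 checksum (subfield "z"). After downloading the large imagery files, it is worth verifying them against those checksums before unzipping. A minimal sketch in Python follows; the filenames and md5 values are copied from the record's 856 fields, and the helper names (`md5sum`, `verify`) are illustrative, not part of the published code.

```python
import hashlib

# Expected md5 checksums, copied from the record's 856 datafields above.
EXPECTED_MD5 = {
    "CNN_LABELS_2019.zip": "cd47bc6b75f674a9fde56536cf4f3996",
    "NEON_D17_TEAK_DP1QA_20170627_181333_RGB_Reflectance.zip":
        "1b1bf2aa3e618d647491f3c77d0884ae",
}

def md5sum(path, chunk_size=1 << 20):
    """Compute the md5 of a file in 1 MiB chunks (the imagery is multi-GB,
    so the file is streamed rather than read into memory at once)."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected):
    """Return True if the file at `path` matches the expected md5 hex digest."""
    return md5sum(path) == expected
```

A downloaded file would then be checked with, e.g., `verify("CNN_LABELS_2019.zip", EXPECTED_MD5["CNN_LABELS_2019.zip"])`.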