Dataset (Open Access)

Dakar very-high resolution land cover map

Tais Grippa; Stefanos Georganos


Citation Style Language JSON Export

{
  "publisher": "Zenodo", 
  "DOI": "10.5281/zenodo.1290800", 
  "title": "Dakar very-high resolution land cover map", 
  "issued": {
    "date-parts": [
      [
        2018, 
        6, 
        15
      ]
    ]
  }, 
  "abstract": "<p>This land cover map of Dakar (Senegal) was created from a Pl&eacute;iades very-high resolution imagery with a spatial resolution of 0.5 meter. The methodology followed a open-source semi-automated framework [1] that rely on <a href=\"https://grass.osgeo.org/\">GRASS GIS</a>&nbsp;using a local unsupervised optimization approach for the segmentation part [2-3].</p>\n\n<p>Description of the files:</p>\n\n<ul>\n\t<li>&quot;Landcover.zip&quot; :&nbsp;The direct output from the supervised classification using the Random Forest classifier.</li>\n\t<li>&quot;Landcover_Postclassif_Level8_Splitbuildings.zip&quot; : Post-processed version of the previous map (&quot;Landcover&quot;), with reduced misclassifications from the original classification (rule-based used to reclassify&nbsp;the errors, with a focus on built-up classes).</li>\n\t<li>&quot;Landcover_Postclassif_Level8_modalfilter3.zip&quot; : Smoothed version of the previous product (modal filter with window 3x3 applied on the &quot;Landcover_Postclassif_Level8_Splitbuildings&quot;).&nbsp;</li>\n\t<li>&quot;Landcover_Postclassif_Level9_Shadowsback.zip&quot; : Corresponds to the &quot;level8_Splitbuildings&quot; with shadows coming&nbsp;from the original classification.</li>\n\t<li>&quot;Dakar_legend_colors.txt&quot; : Text file providing the&nbsp;correspondance between the value of the pixels and the legend labels and a proposition of color to be used.</li>\n</ul>\n\n<p>&nbsp;</p>\n\n<p>References:</p>\n\n<p>[1]&nbsp;Grippa, Ta&iuml;s, Moritz Lennert, Benjamin Beaumont, Sabine Vanhuysse, Nathalie Stephenne, and El&eacute;onore Wolff. 2017. &ldquo;An Open-Source Semi-Automated Processing Chain for Urban Object-Based Classification.&rdquo; <em>Remote Sensing</em> 9 (4): 358. <a href=\"https://doi.org/10.3390/rs9040358\">https://doi.org/10.3390/rs9040358</a>.</p>\n\n<p>[2]&nbsp;Grippa, Tais, Stefanos Georganos, Sabine G. Vanhuysse, Moritz Lennert, and El&eacute;onore Wolff. 2017. &ldquo;A Local Segmentation Parameter Optimization Approach for Mapping Heterogeneous Urban Environments Using VHR Imagery.&rdquo; In <em>Proceedings Volume 10431, Remote Sensing Technologies and Applications in Urban Environments II.</em>, edited by Wieke Heldens, Nektarios Chrysoulakis, Thilo Erbertseder, and Ying Zhang, 20. SPIE. <a href=\"https://doi.org/10.1117/12.2278422\">https://doi.org/10.1117/12.2278422</a>.</p>\n\n<p>[3]&nbsp;Georganos, Stefanos, Ta&iuml;s Grippa, Moritz Lennert, Sabine Vanhuysse, and Eleonore Wolff. 2017. &ldquo;SPUSPO: Spatially Partitioned Unsupervised Segmentation Parameter Optimization for Efficiently Segmenting Large Heterogeneous Areas.&rdquo; In <em>Proceedings of the 2017 Conference on Big Data from Space (BiDS&rsquo;17)</em>.</p>\n\n<p>&nbsp;</p>\n\n<p>Founding:&nbsp;</p>\n\n<p>This dataset was&nbsp;produced in the frame of two research project : MAUPP (<a href=\"http://maupp.ulb.ac.be\">http://maupp.ulb.ac.be</a>)&nbsp;and REACT (<a href=\"http://react.ulb.be\">http://react.ulb.be</a>), funded by the&nbsp;Belgian Federal Science Policy Office (<a href=\"http://eo.belspo.be/About/Stereo3.aspx\">BELSPO</a>).</p>", 
  "author": [
    {
      "family": "Grippa", 
      "given": "Tais"
    }, 
    {
      "family": "Georganos", 
      "given": "Stefanos"
    }
  ], 
  "note": "The production of this dataset was founded by BELSPO (Belgian Federal Science Policy Office) in the frame of the\nSTEREO III program, as part of the MAUPP (SR/00/304) and REACT (SR/00/337) project (http://maupp.ulb.ac.be\nand http://react.ulb.be/).", 
  "version": "V1.0", 
  "type": "dataset", 
  "id": "1290800"
}
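
The CSL JSON export above can be read with any JSON parser. Below is a minimal Python sketch, assuming the export has been saved locally as "dakar_csl.json" (the filename is hypothetical); it assembles a plain-text citation from the "author", "issued", "title", "version", "publisher", and "DOI" fields.

import json

# Load the CSL JSON export (save the record above under this assumed filename).
with open("dakar_csl.json", encoding="utf-8") as fh:
    record = json.load(fh)

# CSL JSON splits author names into "family"/"given" parts.
authors = "; ".join(
    ", ".join(part for part in (a.get("family"), a.get("given")) if part)
    for a in record["author"]
)
year = record["issued"]["date-parts"][0][0]

citation = (
    f"{authors} ({year}). {record['title']} ({record['version']}) "
    f"[Data set]. {record['publisher']}. https://doi.org/{record['DOI']}"
)
print(citation)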
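The files listed in the description can also be retrieved programmatically through the public Zenodo REST API using the record id (1290800). The sketch below is a minimal example, not part of the record itself; it assumes the API response exposes a "files" list whose entries carry a filename ("key") and a direct download URL ("links" -> "self"), and it downloads the legend file "Dakar_legend_colors.txt" mentioned above.

import requests

RECORD_ID = "1290800"  # Zenodo record id from the metadata above

# Fetch the record metadata from the public Zenodo REST API.
resp = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}", timeout=30)
resp.raise_for_status()
record = resp.json()

# List the deposited files; each entry is assumed to expose a filename ("key")
# and a direct download URL under "links" -> "self".
for entry in record.get("files", []):
    print(entry["key"], entry["links"]["self"])

# Download the legend file (pixel value -> legend label -> suggested color).
legend = next(e for e in record["files"] if e["key"] == "Dakar_legend_colors.txt")
data = requests.get(legend["links"]["self"], timeout=60)
data.raise_for_status()
with open("Dakar_legend_colors.txt", "wb") as fh:
    fh.write(data.content)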
Statistics         All versions   This version
Views              507            507
Downloads          117            117
Data volume        11.0 GB        11.0 GB
Unique views       488            488
Unique downloads   51             51
