Dataset Open Access

Indoor3Dmapping dataset

Armando Arturo Sánchez Alcázar; Giovanni Pintore; Matteo Sgrenzaroli


JSON-LD (schema.org) Export

{
  "inLanguage": {
    "alternateName": "eng", 
    "@type": "Language", 
    "name": "English"
  }, 
  "description": "<p><strong>Data Organization</strong><br>\nUnder the root directory for the whole acquisition, there is a <em>positions.csv</em>&nbsp;file and 3 subdirectories: <em>img</em>, <em>dense</em>, and <em>sparse</em>. The mobile mapping 3D dataset was generated by walking around an indoor space, and each&nbsp;<strong>&lt;positionID&gt;</strong> corresponds to a unique pose along the trajectory of this motion. This version of the dataset contains a total of 99 unique poses, with a separation of 1 meter between adjacent poses.</p>\n\n<pre><code>root\n\u251c\u2500\u2500 img\n\u2502  \u251c\u2500\u2500 &lt;positionID&gt;.jpg\n|  \u2514\u2500\u2500 ...\n\u251c\u2500\u2500 dense\n\u2502  \u251c\u2500\u2500 &lt;positionID&gt;.png\n|  \u2514\u2500\u2500 ...\n\u251c\u2500\u2500 sparse\n\u2502  \u251c\u2500\u2500 &lt;positionID&gt;.png\n|  \u2514\u2500\u2500 ...\n\u2514\u2500\u2500 positions.csv</code></pre>\n\n<p><strong>positions.csv</strong></p>\n\n<ul>\n\t<li>File format: One ASCII file.</li>\n\t<li>File structure, rows: Each image is one record.</li>\n\t<li>File structure, columns: Comma-separated with headers, in the exact order described below.\n\t<ul>\n\t\t<li>Filename, column 0: Panorama file name as on disk, without file extension.</li>\n\t\t<li>Timestamp, column 1: Absolute time at which the panorama was captured, in decimal notation without thousands separator (microseconds).</li>\n\t\t<li>X,Y,Z, columns 2 through 4: Position of the panoramic camera, in decimal notation without thousands separator (meters).</li>\n\t\t<li>w,x,y,z, columns 5 through 8: Rotation of the camera as a quaternion.</li>\n\t</ul>\n\t</li>\n</ul>\n\n<p><strong>sparse</strong></p>\n\n<ul>\n\t<li>Set of equirectangular rendered depth images.</li>\n\t<li>1920x960 resolution</li>\n\t<li>16-bit grayscale PNG</li>\n\t<li>White &rarr; 0 m</li>\n\t<li>Black &rarr; &ge; 16 m or absent geometry</li>\n\t<li>Occlusions: If a pixel was hit by several rays, only the value of the closest one is represented.</li>\n</ul>\n\n<p><strong>dense</strong></p>\n\n<ul>\n\t<li>Set of equirectangular rendered depth images.</li>\n\t<li>1920x960 resolution</li>\n\t<li>16-bit grayscale PNG</li>\n\t<li>White &rarr; 0 m</li>\n\t<li>Black &rarr; &ge; 16 m or absent geometry</li>\n\t<li>Occlusions: If a pixel was hit by several rays, only the value of the closest one is represented.</li>\n</ul>\n\n<p><strong>img</strong><br>\nA set of equirectangular panoramic images taken with a 360&deg; color camera at 1920x960 resolution. They follow the same trajectory as the depth images.</p>", 
  "license": "https://creativecommons.org/licenses/by/4.0/legalcode", 
  "creator": [
    {
      "affiliation": "GEXCEL, Italy", 
      "@type": "Person", 
      "name": "Armando Arturo S\u00e1nchez Alc\u00e1zar"
    }, 
    {
      "affiliation": "Visual and Data-Intensive Computing, CRS4, Italy", 
      "@id": "https://orcid.org/0000-0001-8944-1045", 
      "@type": "Person", 
      "name": "Giovanni Pintore"
    }, 
    {
      "affiliation": "GEXCEL, Italy", 
      "@type": "Person", 
      "name": "Matteo Sgrenzaroli"
    }
  ], 
  "url": "https://zenodo.org/record/6367381", 
  "datePublished": "2022-03-18", 
  "version": "0.1.0", 
  "keywords": [
    "point cloud", 
    "indoor", 
    "3D mapping"
  ], 
  "@context": "https://schema.org/", 
  "distribution": [
    {
      "contentUrl": "https://zenodo.org/api/files/e10dae32-b20d-43cf-bfe7-52a82a5c307a/Indoor3Dmapping.zip", 
      "encodingFormat": "zip", 
      "@type": "DataDownload"
    }
  ], 
  "identifier": "https://doi.org/10.5281/zenodo.6367381", 
  "@id": "https://doi.org/10.5281/zenodo.6367381", 
  "@type": "Dataset", 
  "name": "Indoor3Dmapping dataset"
}
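As a quick illustration of the positions.csv layout described above, here is a minimal Python sketch that parses the file into per-pose records. The column order follows the description; the header row is mentioned in the description, but the exact header names and the sample values below are assumptions for illustration only.

```python
import csv
import io

def read_positions(fp):
    """Parse a positions.csv stream into a list of pose records.

    Column order per the dataset description:
    Filename, Timestamp (microseconds), X, Y, Z, w, x, y, z (quaternion).
    """
    reader = csv.reader(fp)
    next(reader)  # skip the header row
    poses = []
    for row in reader:
        poses.append({
            "filename": row[0],                               # panorama name, no extension
            "timestamp_us": float(row[1]),                    # absolute capture time in microseconds
            "position": tuple(float(v) for v in row[2:5]),    # X, Y, Z in meters
            "quaternion": tuple(float(v) for v in row[5:9]),  # w, x, y, z rotation
        })
    return poses

# Hypothetical two-pose example in the documented column order
# (adjacent poses are 1 meter apart, as stated in the description).
sample = io.StringIO(
    "Filename,Timestamp,X,Y,Z,w,x,y,z\n"
    "000001,1616161616000000,0.0,0.0,1.5,1.0,0.0,0.0,0.0\n"
    "000002,1616161617000000,1.0,0.0,1.5,1.0,0.0,0.0,0.0\n"
)
poses = read_positions(sample)
print(poses[1]["position"])  # → (1.0, 0.0, 1.5)
```

Each record's "filename" can then be joined with the img, dense, or sparse directory to load the corresponding panorama or depth rendering.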
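The depth encoding used by the sparse and dense folders (white → 0 m, black → ≥ 16 m or missing geometry) can be inverted to metric depth. The description does not state how intermediate gray values are interpolated, so the linear ramp below is an assumption; the array stands in for pixel codes decoded from one of the 16-bit PNGs.

```python
import numpy as np

MAX_DEPTH_M = 16.0  # black (code 0) means >= 16 m or absent geometry
MAX_CODE = 65535    # 16-bit grayscale full scale; white (65535) means 0 m

def depth_from_png_codes(codes):
    """Convert 16-bit grayscale code values to metric depth.

    Assumes a linear ramp between white (0 m) and black (16 m);
    the dataset description does not state this explicitly.
    """
    codes = np.asarray(codes, dtype=np.float64)
    return MAX_DEPTH_M * (1.0 - codes / MAX_CODE)

# Stand-in for decoded pixel codes: white, mid-gray, black.
codes = np.array([65535, 32767, 0], dtype=np.uint16)
depth = depth_from_png_codes(codes)
print(depth)  # white → 0.0 m, black → 16.0 m (or farther / missing)
```

Pixels at exactly 16 m are indistinguishable from absent geometry under this encoding, so downstream code may want to treat code 0 as invalid rather than as a measured depth.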
                All versions   This version
Views           128            128
Downloads       9              9
Data volume     1.1 GB         1.1 GB
Unique views    98             98
Unique downloads 9             9
