Conference paper Open Access

A First-Person Database for Detecting Barriers for Pedestrians

Zenonas Theodosiou; Harris Partaourides; Tolga Atun; Simoni Panayi; Andreas Lanitis


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <controlfield tag="005">20200730125841.0</controlfield>
  <datafield tag="500" ind1=" " ind2=" ">
    <subfield code="a">This work has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement  No 739578 and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development.</subfield>
  </datafield>
  <controlfield tag="001">3747579</controlfield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media Smart Systems and Emerging Technologies, Nicosia, Cyprus</subfield>
    <subfield code="a">Harris Partaourides</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media, Smart Systems and Emerging Technologies (Nicosia, CYPRUS)</subfield>
    <subfield code="a">Tolga Atun</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media, Smart Systems and Emerging Technologies (Nicosia, CYPRUS)</subfield>
    <subfield code="a">Simoni Panayi</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media, Smart Systems and Emerging Technologies (Nicosia, CYPRUS) and  Department of Multimedia and Graphic Arts, Cyprus University of Technology, Limassol, Cyprus</subfield>
    <subfield code="a">Andreas Lanitis</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">2944998</subfield>
    <subfield code="z">md5:efcf2c1b5b3c388200bcf330fc884e1d</subfield>
    <subfield code="u">https://zenodo.org/record/3747579/files/VISAPP2020.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2020-04-10</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="p">user-rise-teaming-cyprus</subfield>
    <subfield code="o">oai:zenodo.org:3747579</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">Research Centre on Interactive Media Smart Systems and Emerging Technologies, Nicosia, Cyprus</subfield>
    <subfield code="a">Zenonas Theodosiou</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">A First-Person Database for Detecting Barriers for Pedestrians</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-rise-teaming-cyprus</subfield>
  </datafield>
  <datafield tag="536" ind1=" " ind2=" ">
    <subfield code="c">739578</subfield>
    <subfield code="a">Research Center on Interactive Media, Smart System and Emerging Technologies</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution Non Commercial No Derivatives 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;increasingly being utilized in several applications to enhance the quality of citizens&amp;rsquo; life, especially for those with visual or motion impairments. The development of sophisticated egocentric computer vision techniques requires automatic analysis of large databases of first-person point of view visual data collected through wearable devices. In this paper, we present our initial findings regarding the use of wearable cameras for enhancing the pedestrians&amp;rsquo; safety while walking in city sidewalks. For this purpose, we create a first-person database that entails annotations on common barriers that may put pedestrians in danger. Furthermore, we derive a framework for collecting visual lifelogging data and define 24 different categories of sidewalk barriers. Our dataset consists of 1796 annotated images covering 1969 instances of barriers. The analysis of the dataset by means of object classification algorithms, depict encouraging results for further study.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5220/0009107506600666</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">conferencepaper</subfield>
  </datafield>
</record>
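
For readers who want to work with this export programmatically, the following is a minimal sketch (not part of the record) that pulls the title, authors, DOI and file details out of the MARC21 XML above using only Python's standard library; the filename marc21.xml is an assumption for wherever the export was saved.

import xml.etree.ElementTree as ET

MARC_NS = {"m": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag):
    """Yield one {code: value} dict per datafield carrying the given MARC tag."""
    for field in record.findall(f"m:datafield[@tag='{tag}']", MARC_NS):
        yield {sf.get("code"): (sf.text or "").strip()
               for sf in field.findall("m:subfield", MARC_NS)}

record = ET.parse("marc21.xml").getroot()  # assumed local copy of the export above

# 245 = title, 100 = first author, 700 = additional authors, 024 = DOI, 856 = file link
title = next(subfields(record, "245"))["a"]
authors = [f["a"] for f in subfields(record, "100")] + \
          [f["a"] for f in subfields(record, "700")]
doi = next(subfields(record, "024"))["a"]
file_info = next(subfields(record, "856"))

print(title)
print("; ".join(authors))
print("DOI:", doi)
print("PDF:", file_info["u"])
print("Size (bytes):", file_info["s"], "| checksum:", file_info["z"])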
Views 14
Downloads 16
Data volume 47.1 MB
Unique views 13
Unique downloads 15
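
The 856 field above also records the PDF's size (2944998 bytes) and md5 checksum, so a downloaded copy can be checked locally. A minimal sketch, assuming the file was saved as VISAPP2020.pdf next to the script:

import hashlib

EXPECTED_SIZE = 2944998                             # subfield s (bytes)
EXPECTED_MD5 = "efcf2c1b5b3c388200bcf330fc884e1d"   # subfield z, without the "md5:" prefix

def verify(path: str) -> bool:
    """Return True if the local file matches the size and md5 listed in the record."""
    digest = hashlib.md5()
    size = 0
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
            size += len(chunk)
    return size == EXPECTED_SIZE and digest.hexdigest() == EXPECTED_MD5

print(verify("VISAPP2020.pdf"))  # True if the local copy matches the record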