Conference paper Open Access

Border surveillance using computer vision enabled robotic swarms for semantically enriched situational awareness

George Orfanidis; Savvas Apostolidis; George Prountzos; Marina Riga; Athanasios Kapoutsis; Konstantinos Ioannidis; Elias Kosmatopoulos; Stefanos Vrochidis; Ioannis Kompatsiaris


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Border surveillance</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Autonomous systems</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Visual detection</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Semantic representations</subfield>
  </datafield>
  <controlfield tag="005">20200728005924.0</controlfield>
  <controlfield tag="001">3961453</controlfield>
  <datafield tag="711" ind1=" " ind2=" ">
    <subfield code="d">29-31 October 2019</subfield>
    <subfield code="g">MSE2019</subfield>
    <subfield code="a">Mediterranean Security Event 2019</subfield>
    <subfield code="c">Heraklion, Crete, Greece</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Savvas Apostolidis</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">George Prountzos</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Marina Riga</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Athanasios Kapoutsis</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Konstantinos Ioannidis</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Elias Kosmatopoulos</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Stefanos Vrochidis</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">Ioannis Kompatsiaris</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">3437240</subfield>
    <subfield code="z">md5:52413bec39e40ef142ea895250f54afb</subfield>
    <subfield code="u">https://zenodo.org/record/3961453/files/ROBORDER_MSE2019_submitted.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2019-10-29</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="o">oai:zenodo.org:3961453</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">CERTH</subfield>
    <subfield code="a">George Orfanidis</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Border surveillance using computer vision enabled robotic swarms for semantically enriched situational awareness</subfield>
  </datafield>
  <datafield tag="536" ind1=" " ind2=" ">
    <subfield code="c">740593</subfield>
    <subfield code="a">autonomous swarm of heterogeneous RObots for BORDER surveillance</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;Abstract.&lt;br&gt;
Cross-border criminals exploit advanced modern systems to carry out their illegal activities; innovative sensory systems and specialized equipment, for instance, have been used to track humans and various illicit materials. The growing challenges that border personnel must resolve likewise require the use of recent technological advances. Thus, the adoption of pioneering technologies seems imperative to stay ahead of technologically organized crime. Towards this objective, the introduction of Unmanned Vehicles (UxVs) and advances in their relevant sub-systems have created a new means to fight cross-border crime: a combination of UxVs enriched with enhanced detection capabilities constitutes an effective solution. This chapter introduces and presents the capabilities of an autonomous navigation system that exploits swarm intelligence principles to simplify the overall operation. Advances in computer vision and semantic enrichment of the acquired information are incorporated to deliver cutting-edge technologies. The described architecture and services can provide a complete solution for optimal border surveillance and increased situational awareness.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="n">doi</subfield>
    <subfield code="i">isVersionOf</subfield>
    <subfield code="a">10.5281/zenodo.3961452</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.3961453</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">conferencepaper</subfield>
  </datafield>
</record>