Journal article Open Access

The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception

J. P. Rodríguez Gómez; R. Tapia; J. L. Paneque; P. Grau; A. Gómez Eguíluz; J. R. Martínez-de Dios; A. Ollero


Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>J. P. Rodríguez Gómez</dc:creator>
  <dc:creator>R. Tapia</dc:creator>
  <dc:creator>J. L. Paneque</dc:creator>
  <dc:creator>P. Grau</dc:creator>
  <dc:creator>A. Gómez Eguíluz</dc:creator>
  <dc:creator>J. R. Martínez-de Dios</dc:creator>
  <dc:creator>A. Ollero</dc:creator>
  <dc:date>2021-02-02</dc:date>
  <dc:description>The development of automatic perception systems and techniques for bio-inspired flapping-wing robots is severely hampered by the high technical complexity of these platforms and of the installation of onboard sensors and electronics. In addition, flapping-wing robot perception suffers from high vibration levels and abrupt movements during flight, which cause motion blur and strong changes in lighting conditions. This letter presents a perception dataset for bird-scale flapping-wing robots as a tool to help alleviate the aforementioned problems. The presented data include measurements from onboard sensors widely used in aerial robotics and suitable for dealing with the perception challenges of flapping-wing robots, such as an event camera, a conventional camera, and two Inertial Measurement Units (IMUs), as well as ground truth measurements from a laser tracker or a motion capture system. A total of 21 datasets of different types of flights were collected in three different scenarios (one indoor and two outdoor). To the best of the authors' knowledge, this is the first dataset for flapping-wing robot perception.</dc:description>
  <dc:description>Link to the Dataset:
https://grvc.us.es/eye-bird-dataset/</dc:description>
  <dc:identifier>https://zenodo.org/record/4916361</dc:identifier>
  <dc:identifier>10.1109/LRA.2021.3056348</dc:identifier>
  <dc:identifier>oai:zenodo.org:4916361</dc:identifier>
  <dc:relation>info:eu-repo/grantAgreement/EC/H2020/788247/</dc:relation>
  <dc:relation>info:eu-repo/grantAgreement/EC/H2020/871479/</dc:relation>
  <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
  <dc:rights>https://creativecommons.org/licenses/by/4.0/legalcode</dc:rights>
  <dc:source>IEEE Robotics and Automation Letters 6(2) 1066-1073</dc:source>
  <dc:subject>Data sets for robotic vision, vision-based navigation, aerial systems: perception and autonomy, flapping-wing robots, event-based cameras</dc:subject>
  <dc:title>The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception</dc:title>
  <dc:type>info:eu-repo/semantics/article</dc:type>
  <dc:type>publication-article</dc:type>
</oai_dc:dc>
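The Dublin Core export above is namespaced OAI-PMH XML, so repeatable fields such as `dc:creator` and `dc:identifier` appear as multiple sibling elements. A minimal sketch of reading such a record with Python's standard-library `xml.etree.ElementTree` (the variable names and the idea of collecting repeated fields into lists are illustrative assumptions, not part of the record):

```python
# Sketch: parse a Dublin Core (oai_dc) record into a dict of field lists.
# Repeatable fields (creator, identifier, relation, ...) become lists.
import xml.etree.ElementTree as ET

def parse_record(xml_text):
    """Return {field_name: [values]} for a Dublin Core oai_dc record."""
    root = ET.fromstring(xml_text)
    record = {}
    for elem in root:
        # Strip the namespace URI: '{http://purl.org/dc/...}creator' -> 'creator'
        field = elem.tag.split("}", 1)[1]
        record.setdefault(field, []).append(elem.text)
    return record
```

For the record above, `parse_record` would yield seven entries under `creator` and three under `identifier` (the Zenodo URL, the DOI, and the OAI identifier).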
Views 41
Downloads 80
Data volume 1.1 GB
Unique views 33
Unique downloads 75
