Gomez Eguiluz, Augusto
Rodríguez Gómez, Juan Pablo
Martínez de Dios, José Ramiro
Ollero, Anibal
2021-01-23
<p>This paper presents a bio-inspired event-based perception scheme for agile aerial robot maneuvering. It mimics birds, which perform purposeful maneuvers by closing the separation in the retinal image (w.r.t. the goal) along time-to-contact trajectories. The proposed approach is based on event cameras, also called artificial retinas, which provide fast response and robustness against motion blur and challenging lighting conditions. Our scheme guides the robot by adjusting only the positions of features extracted in the event image plane toward their goal positions at a predefined time, following smooth time-to-contact trajectories. The proposed scheme is robust, efficient, and can be added on top of commonly used aerial robot velocity controllers. It has been validated on board a UAV, with real-time computation on low-cost hardware, in sets of experiments with different descent maneuvers and lighting conditions.</p>
https://doi.org/10.5281/zenodo.4459506
oai:zenodo.org:4459506
eng
Zenodo
https://zenodo.org/communities/eu
https://doi.org/10.5281/zenodo.4459505
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
IROS, IEEE/RSJ International Conference on Intelligent Robots and Systems, Las Vegas, 25 November 2021
event camera
aerial robots
perception systems
time-to-contact
tau theory
event-based vision
line tracker
visual servoing
Asynchronous Event-based Line Tracking for Time-to-Contact Maneuvers in UAS
info:eu-repo/semantics/conferencePaper