Conference paper · Open Access

Asynchronous Event-based Line Tracking for Time-to-Contact Maneuvers in UAS

A. Gomez Eguiluz; J. P. Rodríguez-Gómez; J. R. Martínez-de Dios; A. Ollero

This paper presents a bio-inspired event-based perception scheme for agile aerial robot maneuvering. It mimics birds, which perform purposeful maneuvers by closing the separation in the retinal image with respect to the goal, following time-to-contact trajectories. The proposed approach is based on event cameras, also called artificial retinas, which provide fast response and robustness against motion blur and challenging lighting conditions. Our scheme guides the robot by adjusting only the positions of features extracted in the event image plane toward their goal positions at a predefined time, using smooth time-to-contact trajectories. The proposed scheme is robust and efficient, and can be added on top of commonly used aerial robot velocity controllers. It has been validated on board a UAV, with real-time computation on low-cost hardware, in sets of experiments with different descent maneuvers and lighting conditions.
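The core idea — driving each image-plane feature to its goal position by a predefined contact time along a smooth trajectory — can be sketched as a simple proportional law whose gain grows as the time budget shrinks. The following is a minimal illustrative sketch, not the authors' implementation; the function name, the linear closure profile, and all numeric values are assumptions for demonstration.

```python
import numpy as np

def ttc_feature_velocity(p, p_goal, t, t_contact, min_dt=1e-3):
    """Image-plane velocity that closes the feature error by t_contact.

    Dividing the remaining error by the remaining time yields a
    time-to-contact-style profile: the feature position converges to
    the goal linearly, reaching it exactly at the contact time.
    (Hypothetical sketch; not the paper's controller.)
    """
    remaining = max(t_contact - t, min_dt)  # guard against division by zero
    return (np.asarray(p_goal) - np.asarray(p)) / remaining

# Simulated closed loop: one tracked feature converging to its goal.
p = np.array([120.0, 80.0])    # current feature position (pixels, assumed)
goal = np.array([64.0, 64.0])  # desired position at contact (assumed)
dt, t, t_contact = 0.01, 0.0, 2.0
while t < t_contact:
    v = ttc_feature_velocity(p, goal, t, t_contact)
    p = p + v * dt             # integrate the commanded image-plane velocity
    t += dt

print(np.round(p, 1))          # feature ends up at the goal position
```

In a real system the commanded image-plane velocity would be mapped to a body-frame velocity reference for the UAV's existing velocity controller, which is why the scheme can sit on top of commonly used autopilots.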

This work was supported by the ARM-EXTEND (DPI2017-8979-R) project funded by the Spanish National R&D Plan.
Files (6.0 MB)
Name: IROS2020___Event_based_visual_servoying.pdf (md5:2ec3592889ff999dd3db2f47c3c162cd)
Size: 6.0 MB
Statistics          All versions   This version
Views               84             16
Downloads           74             17
Data volume         440.8 MB       101.3 MB
Unique views        52             12
Unique downloads    63             12
