Published January 23, 2021 | Version 3
Conference paper | Open Access

Asynchronous Event-based Line Tracking for Time-to-Contact Maneuvers in UAS

Description

This paper presents a bio-inspired event-based perception scheme for agile aerial robot maneuvering. The scheme mimics birds, which perform purposeful maneuvers by closing the separation to the goal in the retinal image while following time-to-contact trajectories. The proposed approach relies on event cameras, also called artificial retinas, which provide fast response and robustness against motion blur and challenging lighting conditions. Our scheme guides the robot solely by driving the positions of features extracted in the event image plane to their goal positions at a predefined time, following smooth time-to-contact trajectories. The proposed scheme is robust, efficient, and can be added on top of commonly used aerial robot velocity controllers. It has been validated on board a UAV, with real-time computation on low-cost hardware, in sets of experiments with different descent maneuvers and lighting conditions.
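
As a rough illustration of the idea only (not the authors' implementation), the sketch below shows one simple way an image-plane separation could be closed at a predefined time: the remaining error between extracted features and their goal positions is turned into an image-plane velocity setpoint that decays the error to zero exactly at the target time, which an underlying velocity controller would then track. All function names, variable names, and numbers are hypothetical assumptions for illustration.

```python
import numpy as np

def ttc_velocity_command(feature_px, goal_px, t, t_goal, gain=1.0):
    """Hypothetical sketch: drive image-plane features toward their goal
    positions so the remaining separation reaches zero at the predefined
    time t_goal, mimicking a smooth time-to-contact profile.

    feature_px, goal_px : (N, 2) arrays of pixel coordinates, assumed to
                          come from features extracted in the event image
                          plane.
    t, t_goal           : current time and the time at which the
                          separation should vanish.
    Returns an (N, 2) array of desired image-plane feature velocities.
    """
    error = goal_px - feature_px          # remaining separation per feature
    tau = max(t_goal - t, 1e-3)           # time left to close the gap
    # Closing the gap at rate error / tau makes each separation decay
    # linearly to zero at t_goal; the result would be passed to the
    # aerial robot's velocity controller.
    return gain * error / tau

# Minimal usage example with made-up numbers.
features = np.array([[320.0, 240.0], [100.0, 200.0]])   # current pixels
goals    = np.array([[300.0, 260.0], [120.0, 220.0]])   # goal pixels
v_px = ttc_velocity_command(features, goals, t=2.0, t_goal=5.0)
print(v_px)   # image-plane velocity setpoints for the velocity controller
```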

Notes

This work was supported by the ARM-EXTEND (DPI2017-8979-R) project funded by the Spanish National R&D Plan.

Files (6.0 MB)

IROS2020___Event_based_visual_servoying.pdf (6.0 MB)
md5:2ec3592889ff999dd3db2f47c3c162cd

Additional details

Funding

GRIFFIN – General compliant aerial Robotic manipulation system Integrating Fixed and Flapping wings to INcrease range and safety (grant agreement 788247), European Commission
AERIAL-CORE – AERIAL COgnitive integrated multi-task Robotic system with Extended operation range and safety (grant agreement 871479), European Commission