Published March 26, 2025 | Version 1
Dataset | Open Access

Drone onboard multi-modal sensor and feature-based visual odometry dataset for complex outdoor scenarios

  • 1. KIOS Research and Innovation Center of Excellence
  • 2. University of Cyprus

Description

The data acquisition missions were planned and carried out using the flight route planning feature of DJI Pilot 2. These missions followed five distinct geometric patterns: 1. linear-triangular, 2. circular, 3. rectangular, 4. linear, and 5. multi-dimensional. Each mission was executed as a waypoint-based flight, enabling control over parameters such as altitude, speed, and turning angles at each waypoint. The dataset captures 3D flight dynamics, including take-off, landing, and altitude variations to reflect changes along the z-axis. Data was recorded at a fixed frequency of 10 Hz.

To ensure consistency throughout the dataset, identical parameters were maintained across all data acquisition missions. The dataset encompasses 18 unique drone flights, with each trajectory repeated multiple times, totaling approximately 15 minutes of flight time per mission. Structured as time-series data, each flight is associated with a unique flight identifier and timestamp.
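As a sketch, the time-series structure described above can be loaded and traversed per flight with pandas. The flight-identifier and timestamp column names used here (`flight_id`, `timestamp`) are assumptions and should be checked against the CSV header; a tiny synthetic frame stands in for the real file.

```python
import pandas as pd

# In practice: df = pd.read_csv("Drone Onboard Multi-Modal Feature-Based Visual Odometry Dataset.csv")
# Here a small synthetic stand-in illustrates the assumed schema
# (the column names "flight_id" and "timestamp" are assumptions, not confirmed).
df = pd.DataFrame({
    "flight_id": [1, 1, 1, 2, 2],
    "timestamp": [0.0, 0.1, 0.2, 0.0, 0.1],  # 10 Hz sampling -> 0.1 s spacing
    "position_z": [0.0, 0.5, 1.0, 0.0, 0.4],
})

# Iterate over flights in temporal order, one trajectory at a time.
for flight_id, flight in df.sort_values("timestamp").groupby("flight_id"):
    duration = flight["timestamp"].max() - flight["timestamp"].min()
    print(flight_id, len(flight), f"{duration:.1f} s")
```

Grouping by the flight identifier keeps each of the 18 trajectories separate, which matters for any model that must not mix samples across flights.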

The drone's spatial coordinates are captured by position_x, position_y, and position_z, while its orientation is described by the quaternion components orientation_x, orientation_y, orientation_z, and orientation_w. The translational and angular dynamics of the drone are recorded through velocity_x, velocity_y, velocity_z and angular_x, angular_y, angular_z, respectively. Linear acceleration is provided by linear_acceleration_x, linear_acceleration_y, and linear_acceleration_z.
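Since orientation is stored as quaternion components (orientation_x, orientation_y, orientation_z, orientation_w), a common first step is extracting a yaw angle from them. A minimal sketch, assuming unit quaternions in the standard x-y-z-w component order:

```python
import math

def quaternion_to_yaw(x: float, y: float, z: float, w: float) -> float:
    """Yaw (rotation about the z-axis, in radians) from a unit quaternion
    given as orientation_x, orientation_y, orientation_z, orientation_w."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Identity quaternion -> no rotation, yaw = 0
print(quaternion_to_yaw(0.0, 0.0, 0.0, 1.0))  # 0.0

# 90-degree rotation about the z-axis -> yaw = pi/2
half = math.sqrt(0.5)
print(quaternion_to_yaw(0.0, 0.0, half, half))
```

The same formula applied row-wise gives a heading signal that can be cross-checked against the dataset's own heading and yaw columns.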

Environmental conditions, including wind_speed and wind_angle, were recorded using the TriSonica Mini Wind and Weather Sensor. Additionally, drone power system metrics such as battery_voltage and battery_current are included.

The dataset further integrates flight control and estimation data. Parameters such as escSpeed and escVoltage offer insight into motor activity, while d_roll, d_pitch, and d_yaw represent the drone's control input rates. Orientation-related data are complemented by heading and yaw. In addition, Sensor_Fusion_lon and Sensor_Fusion_lat provide fused GPS-based positional estimates derived from sensor fusion algorithms.

Data Acquisition Paths: DataAcquisitionMissions

The dataset contains labels indicating various operational states of the drone, including IDLE_HOVER, ASCEND, TURN, HMSL, and DESCEND. These labels are useful for identifying and classifying the drone's behavior during flight. Additionally, the annotated data can support multi-task learning applications, such as trajectory prediction.
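The operational-state labels can serve directly as classification targets. A minimal sketch of selecting and encoding them, assuming the label column is named `label` (the actual column name should be verified in the CSV header); a synthetic frame stands in for the dataset:

```python
import pandas as pd

# Synthetic stand-in for the dataset; the "label" column name is an assumption.
df = pd.DataFrame({
    "label": ["IDLE_HOVER", "ASCEND", "TURN", "ASCEND", "DESCEND"],
    "velocity_z": [0.0, 1.2, 0.1, 1.1, -1.3],
})

# Select samples from a single operational state, e.g. for per-state analysis.
ascend = df[df["label"] == "ASCEND"]

# One-hot encode the state labels, e.g. as one task head in a
# multi-task setup alongside trajectory prediction.
targets = pd.get_dummies(df["label"])
print(ascend.shape, sorted(targets.columns))
```

One-hot targets pair naturally with the continuous regression targets (position, velocity) already present in the same rows.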

Data collection was conducted using the DJI Matrice 300 RTK, selected for its compatibility with onboard development kits, which allows seamless access to sensor and flight controller data. An NVIDIA Jetson Xavier NX served as the onboard computing unit running the developed software. By leveraging DJI's Onboard SDK, the Jetson device enables real-time retrieval and processing of data from the drone's onboard systems.

Files

Drone Onboard Multi-Modal Feature-Based Visual Odometry Dataset.csv (47.5 MB)

Additional details

Related works

Continues
Dataset: 10.5281/zenodo.13682870 (DOI)

Funding

European Commission
KIOS CoE - KIOS Research and Innovation Centre of Excellence 739551

Dates

Available
2025-03-26