Published October 17, 2022 | Version 1.1
Dataset | Open Access

SEENIC: dataset for Spacecraft posE Estimation with NeuromorphIC vision

  • Mohsi Jawaid, Ethan Elms, Yasir Latif, Tat-Jun Chin (The University of Adelaide)

Description

This dataset accompanies the paper "Towards Bridging the Space Domain Gap for Satellite Pose Estimation using Event Sensing" (arXiv, IEEE Xplore) and supports satellite pose estimation with an event camera.

Events and ground-truth camera poses were captured for 20 live scenes in total, enumerating all combinations of two trajectories, two camera speeds and five lighting configurations. Sample event frames and dataset statistics, along with the pose estimation method evaluated on this dataset, are available in the paper linked above.


Live-capture scene names use the following encoding: {satellite model}-{trajectory}-{speed}-{lighting configuration}
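As a small illustration, a scene name in this encoding can be split back into its four fields, assuming no field itself contains a hyphen. The name used below is a made-up placeholder that merely follows the pattern, not an actual scene from the archive.

```python
# Split a live-capture scene name into its encoded fields.
# "modelA-traj1-slow-light3" is a hypothetical example name following
# the {satellite model}-{trajectory}-{speed}-{lighting} pattern.
model, trajectory, speed, lighting = "modelA-traj1-slow-light3".split("-")
print(model, trajectory, speed, lighting)  # modelA traj1 slow light3
```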

The calibration scene (calibration.tar.gz) includes multiple views of a chessboard used to calibrate the camera intrinsics and extrinsics for the live-capture scenes. Camera parameters calibrated using this scene can be found in the calib.txt file, with the format: fx fy cx cy k1 k2 p1 p2 k3.
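A minimal sketch of turning the nine space-separated values in calib.txt into a 3x3 intrinsic matrix and an OpenCV-ordered distortion vector. The helper name and the numeric values below are illustrative placeholders, not the actual calibration results.

```python
# Parse "fx fy cx cy k1 k2 p1 p2 k3" into an intrinsic matrix K and
# a distortion vector [k1, k2, p1, p2, k3] (OpenCV ordering).
def parse_calib(text):
    fx, fy, cx, cy, k1, k2, p1, p2, k3 = (float(v) for v in text.split())
    K = [[fx, 0.0, cx],
         [0.0, fy, cy],
         [0.0, 0.0, 1.0]]
    dist = [k1, k2, p1, p2, k3]
    return K, dist

# Placeholder numbers for illustration only (read the real values
# from calib.txt in the archive):
K, dist = parse_calib("600.0 600.0 320.0 240.0 -0.1 0.01 0.0 0.0 0.0")
```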


All live-capture scenes have the same data format:

scene/

    poses/ -- Raw timestamped robot gripper-to-base transforms

    cam-poses.csv -- Ground truth camera poses with the format {timestamp, Rx, Ry, Rz, x, y, z}

    events.csv -- Event stream with the format {timestamp, x, y, polarity (0=off, 1=on)}

    meta.json -- Metadata file with camera frame dimensions

Note: all timestamps are in microseconds.
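A sketch of reading a live-capture scene's CSV files with the column orders stated above, converting the microsecond timestamps to seconds. Whether the files carry a header row is an assumption here; the sample rows are fabricated for illustration.

```python
import csv
import io

def read_events(f):
    """Yield (t_seconds, x, y, polarity) rows from events.csv
    (timestamps are microseconds in the file)."""
    for ts, x, y, pol in csv.reader(f):
        yield int(ts) / 1e6, int(x), int(y), int(pol)

def read_poses(f):
    """Yield (t_seconds, (Rx, Ry, Rz), (x, y, z)) rows from cam-poses.csv."""
    for ts, rx, ry, rz, x, y, z in csv.reader(f):
        yield (int(ts) / 1e6,
               (float(rx), float(ry), float(rz)),
               (float(x), float(y), float(z)))

# Tiny in-memory example with fabricated rows; real files are much larger.
events = list(read_events(io.StringIO("1000000,10,20,1\n1000500,11,20,0\n")))
poses = list(read_poses(io.StringIO("1000000,0.1,0.2,0.3,1.0,2.0,3.0\n")))
```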


The synthetic scene (synthetic.tar.gz) has the following data format:

synthetic/

    poses/ -- Sequential poses captured at a constant time interval

    events.txt -- Event stream with the format: time (float s), x, y, polarity (0=off, 1=on) as specified at https://rpg.ifi.uzh.ch/davis_data.html

    camera_intrinsics.txt -- The camera intrinsic matrix (space separated)

Note: please refer to the paper referenced below for further details on using this synthetic scene.
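Since the synthetic stream follows the space-separated text layout of the UZH-RPG DAVIS datasets (time already in floating-point seconds), it can be read line by line as below. The sample rows are fabricated for illustration.

```python
import io

def read_txt_events(f):
    """Yield (t_seconds, x, y, polarity) from an events.txt stream in the
    space-separated "t x y polarity" layout linked above."""
    for line in f:
        t, x, y, pol = line.split()
        yield float(t), int(x), int(y), int(pol)

# Tiny in-memory example with made-up rows:
events = list(read_txt_events(io.StringIO("0.010000 120 64 1\n0.010250 121 64 0\n")))
```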


When using the data in an academic context, please cite the following paper:

@INPROCEEDINGS{10160531,
    author={Jawaid, Mohsi and Elms, Ethan and Latif, Yasir and Chin, Tat-Jun},
    booktitle={2023 IEEE International Conference on Robotics and Automation (ICRA)},
    title={Towards Bridging the Space Domain Gap for Satellite Pose Estimation using Event Sensing},
    year={2023},
    volume={},
    number={},
    pages={11866-11873},
    keywords={Adaptation models;Satellites;Pose estimation;Lighting;Robot sensing systems;Robustness;Data models},
    doi={10.1109/ICRA48891.2023.10160531}
}

Files (4.0 GB)

calib.txt, 180 Bytes, md5:2c0d85497394ff3707165e7f778182cc
586.3 MB, md5:f2b7c7ee4a14455232e985bfda51675f
8.5 MB, md5:570cdde0016467a2561b4ab33d68e781
7.4 MB, md5:dab661fd48980923ab84bf4aa69f8c0f
30.4 MB, md5:c31dd2d70c01d86c29fc9f2e88925f06
18.6 MB, md5:77fb3457ca9e8241b9167692c9405e0d
8.4 MB, md5:10567577be8809c2360eee8141515ad3
33.0 MB, md5:2c7e95497ebd46cee848a2f0e72aa034
30.8 MB, md5:fa22b85fc435f229b9d63b2fb955fda1
169.8 MB, md5:03a27878eb8bfb0976cb2cbc6ea8d1b5
51.9 MB, md5:dc44c1e367288a359a28ab57bd59e1dd
22.0 MB, md5:1a3a6d1a7484652899e4b06028750417
30.1 MB, md5:edcd9bcc871e97cef2f78ec981fbf41e
29.8 MB, md5:0acfb6c8f7c6c34e965048779e136b76
67.3 MB, md5:0e77ba3b865934f7d7071b8e1b12770c
51.6 MB, md5:724fc3cbe6eb02e8b49f5d1e7ee2ee3b
22.6 MB, md5:405bb99264cb059234f5938f34376209
203.0 MB, md5:4c25ad4675ebfa957aeaa3ccc54f998a
169.4 MB, md5:6a252bfcb25caf4589f33ab5774729a8
1.1 GB, md5:1942ddbaa18eb4f46282cae7df38bb00
262.7 MB, md5:7dab402d45f9a3fcc043d795f78423be
92.9 MB, md5:63cf5a97df30b90f84b3ae85918a6cf2
1.1 GB, md5:57763d179a9043f72e96ee6ff5a726e2

Additional details

Related works

Is published in
Conference paper: 10.1109/ICRA48891.2023.10160531 (DOI)