Published July 11, 2025 | Version v1
Dataset Open

FRESH (Fusion with RGB and Events for Spacecraft pose estimation under Harsh lighting) Dataset

  • University of Adelaide

Description

FRESH (Fusion with RGB and Events for Spacecraft pose estimation under Harsh lighting) is a dataset for evaluating RGB and event sensors for satellite pose estimation under challenging real-world lighting. The RGB and event sensors used to capture the data were optically aligned via a glass beamsplitter and temporally synchronized using custom multithreaded software. Three 3D-printed models of satellite objects were captured in rotating sequences in an uncluttered environment. The challenging lighting conditions were engineered using a directional LED lamp producing 170,000 lux at a 1 m distance with a daylight colour temperature of 5600 K. This light creates glare, blooming, overexposure and lens flare artifacts, which occur in real orbital scenarios and occlude the structure of the target satellite.

The primary purpose of the dataset is to provide optically and temporally aligned RGB and event data to support research into event-RGB fusion for satellite pose estimation under harsh lighting conditions. Furthermore, the aligned data allow a like-for-like evaluation of the two sensors. Specifically, an event camera (Prophesee Gen 4 with the Prophesee-Sony IMX636 sensor) and an RGB camera (Basler a2A1920-160ucPRO with the Sony IMX392 sensor) were used to capture this dataset.

Ground-truthing was performed by manually labelling the sequences in Blender (see the citation below for full details). Accordingly, the dataset includes ground-truth poses of the satellite relative to the capture setup.

The structure of the dataset is as follows:

  • synthetic

    • satty

      • frames

      • train.json

      • test.json

    • cassini

      • frames

      • train.json

      • test.json

    • soho

      • frames

      • train.json

      • test.json

  • real

    • 24 sequences in the format: <object>-<trajectory_index>-<distance>

      • events

        • events.txt

      • frames

      • test.json

      • timestamps.txt

    • harsh-and-slow-frames.json

  • models

    • <object> (i.e. satty, cassini, or soho)

      • dense.json: JSON file containing one key “dense_points”, a Wx3 list of 3D points representing a dense point cloud used to evaluate the event certifier method of M. Jawaid, R. Talak, Y. Latif, L. Carlone, and T.-J. Chin, “Test-time certifiable self-supervision to bridge the sim2real gap in event-based satellite pose estimation”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2024.

      • STL file(s) for the satellite models
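As a sketch, the dense point cloud above can be loaded with NumPy (the key name follows the description above; the path and function name are illustrative):

```python
import json

import numpy as np


def load_dense_points(path):
    """Load a models/<object>/dense.json file and return the dense point cloud.

    The file is assumed to hold one key, "dense_points", mapping to a Wx3
    list of 3D points, as described in the dataset structure above.
    """
    with open(path) as f:
        pts = np.asarray(json.load(f)["dense_points"], dtype=np.float64)
    assert pts.ndim == 2 and pts.shape[1] == 3, "expected a Wx3 list of 3D points"
    return pts
```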

In each frames directory, the frames are named in the following formats:

  • %05d_rgb.png (for the RGB frames)

  • %05d_event.png (for the event frames)

 

The events.txt file contains the raw event data for each sequence in the following headerless CSV format:

  • timestamp, x, y, polarity

where the timestamp is in microseconds, x and y are the pixel coordinates of the triggered event, and the polarity is either 0 or 1, indicating the sign of the brightness change.
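As a sketch, this CSV can be parsed with NumPy and the events binned into a simple per-pixel count image (the function names and the time window are illustrative, not part of the dataset):

```python
import numpy as np


def load_events(path):
    """Load a headerless events.txt with rows: timestamp, x, y, polarity.

    Returns four int64 arrays: timestamps in microseconds, x and y pixel
    coordinates, and polarity flags (0 or 1).
    """
    data = np.loadtxt(path, delimiter=",", dtype=np.int64)
    return data[:, 0], data[:, 1], data[:, 2], data[:, 3]


def event_count_image(t, x, y, t0_us, dt_us, width, height):
    """Count the events falling in [t0_us, t0_us + dt_us) at each pixel."""
    mask = (t >= t0_us) & (t < t0_us + dt_us)
    img = np.zeros((height, width), dtype=np.int32)
    np.add.at(img, (y[mask], x[mask]), 1)  # unbuffered add handles repeated pixels
    return img
```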

 

The train.json and test.json annotations files for both synthetic and real data are in the following format:

  • “landmarks_3d”: Nx3 list of 3D landmarks marked on the satellite object

  • “wireframe_points”: Mx3 list of 3D points of a simple wireframe for visualisation of the pose through reprojection.

  • “wireframe_faces”: PxQ list of P faces of the simple wireframe, where each of the Q entries of a face is an index into the “wireframe_points” list above.

  • “instrinsics”: 3x3 camera intrinsic matrix obtained after calibration and undistortion

  • “annotations”: list of annotations for each frame in the following format:

    • “filename_rgb”: filename in format %05d_rgb.png of the RGB frame this annotation corresponds to

    • “filename_event”: filename in format %05d_event.png of the event frame this annotation corresponds to

    • “pose”: 4x4 pose matrix with both rotation and translation components, in the OpenCV coordinate system.

    • “keypoints”: Nx2 matrix of the “landmarks_3d” reprojected onto the frame using the “pose”

    • “bbox”: x1,y1,x2,y2 format 2D bounding box of the satellite in the frame.
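Since the “keypoints” are stated to be the “landmarks_3d” reprojected via the “pose” and the intrinsic matrix, they can in principle be reproduced from the annotation file alone. The following is a minimal NumPy sketch assuming the standard OpenCV pinhole projection with no residual distortion (the intrinsics are given post-undistortion); the function name and file path are illustrative:

```python
import json

import numpy as np


def reproject_landmarks(annotation_file, frame_index=0):
    """Reproject the 3D landmarks of one frame using its ground-truth pose.

    Uses the OpenCV convention: X_cam = R @ X + t, then homogeneous pixel
    coordinates K @ X_cam normalised by depth. Returns an Nx2 array.
    """
    with open(annotation_file) as f:
        meta = json.load(f)
    K = np.asarray(meta["instrinsics"])              # key spelling as in the description above
    X = np.asarray(meta["landmarks_3d"])             # N x 3 object-frame landmarks
    T = np.asarray(meta["annotations"][frame_index]["pose"])  # 4 x 4 [R | t]
    X_cam = T[:3, :3] @ X.T + T[:3, 3:4]             # 3 x N points in the camera frame
    uv = K @ X_cam                                   # 3 x N homogeneous pixel coordinates
    return (uv[:2] / uv[2]).T                        # N x 2 pixel coordinates
```

If these assumptions hold, the result should agree with the stored “keypoints” up to rounding.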

 

The timestamps.txt file for each sequence in the real data is in the following headerless CSV format:

  • %05d_rgb, timestamp

where %05d_rgb is the RGB frame name without the file extension, and the corresponding timestamp is in microseconds, zero-padded to a width of 16 characters.
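As a sketch, these timestamps can be parsed and used to associate events with each RGB frame (the window size and function names are illustrative, not part of the dataset):

```python
import csv


def load_frame_timestamps(path):
    """Parse a timestamps.txt file with rows of the form '%05d_rgb, <timestamp>'.

    Returns a dict mapping the frame name (without extension) to an integer
    timestamp in microseconds; the 16-character zero padding is harmless to int().
    """
    stamps = {}
    with open(path) as f:
        for name, ts in csv.reader(f):
            stamps[name.strip()] = int(ts.strip())
    return stamps


def events_near_frame(event_timestamps_us, frame_ts_us, half_window_us=5000):
    """Return indices of events within +/- half_window_us of a frame timestamp."""
    return [i for i, t in enumerate(event_timestamps_us)
            if abs(t - frame_ts_us) <= half_window_us]
```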

 

Code relating to this dataset will be maintained at:

https://github.com/mohsij/space-event-rgb-fusion

 

For more details, please refer to the following article. If you find this dataset useful, please consider citing our paper:

@article{JAWAID2025111039,
title = {Event-RGB Fusion for Spacecraft Pose Estimation Under Harsh Lighting},
journal = {Aerospace Science and Technology},
pages = {111039},
year = {2025},
issn = {1270-9638},
doi = {10.1016/j.ast.2025.111039},
url = {https://www.sciencedirect.com/science/article/pii/S1270963825011022},
author = {Mohsi Jawaid and Marcus Märtens and Tat-Jun Chin},
keywords = {event-based pose estimation, rendezvous, domain gap, sensor fusion, close proximity, harsh lighting}}

 

 

Files (19.5 GB)

  • models.zip
  • md5:12072f9b7bf11cc0ae3d959cc5564aae (12.5 MB)
  • md5:82bc34aa54215c19c5595fd3257c20f5 (5.0 GB)
  • md5:5d2281b4ca5f98a541cc6aed47d6d407 (5.0 GB)
  • md5:eb09a695270bf8e5e12ff0dc7fc39503 (2.6 GB)
  • md5:0237e61ebd4704bd189a4ee635205ddd (6.9 GB)

Additional details

Related works

Is described by
10.1016/j.ast.2025.111039 (DOI)

References

  • M. Jawaid, M. Märtens, and T.-J. Chin, 'Event-RGB Fusion for Spacecraft Pose Estimation Under Harsh Lighting', Aerospace Science and Technology, p. 111039, 2025.