Published February 2025 | Version v1
Dataset | Open Access

Lost & Found: Tracking Changes from Egocentric Observations in 3D Dynamic Scene Graphs

  • 1. ETH Zurich
  • 2. Microsoft
  • 3. University of Bonn

Description

This dataset accompanies the paper "Lost & Found: Tracking Changes from Egocentric Observations in 3D Dynamic Scene Graphs", accepted at IEEE Robotics and Automation Letters (RA-L). The dataset provides ground truth and predicted trajectories for object tracking in egocentric pick-and-place interactions, supporting research in dynamic scene understanding and pose tracking.

The dataset consists of 96 pick-and-place sequences involving 9 different objects, captured from an egocentric viewpoint with Aria glasses. Ground truth was obtained via a Vicon motion capture system by attaching motion capture markers to the rigid surface of the carried objects. The alignment between the Vicon and Aria coordinate systems was achieved by leveraging the SLAM output of the Aria glasses in conjunction with motion capture markers on the glasses themselves.

This dataset enables quantitative evaluation of object tracking performance, including translational and rotational error as well as ADD(-S) score computation, and facilitates the reproduction of the results reported in the paper. (Please note that, due to some non-deterministic components of the Lost & Found pipeline, reproduced results may vary slightly.)
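
As a rough illustration of these metrics (this is not our evaluation code, which is referenced in the GitHub repository under src/evaluation), the Python sketch below shows how translational/rotational error and the ADD(-S) score can be computed from a ground-truth and a predicted object pose. It assumes 4x4 homogeneous pose matrices and an (N, 3) array of model points such as the one stored in object_points.npy.

    import numpy as np

    def pose_errors(T_gt, T_pred):
        """Translational and rotational error between two 4x4 homogeneous poses."""
        t_err = np.linalg.norm(T_gt[:3, 3] - T_pred[:3, 3])
        R_rel = T_gt[:3, :3].T @ T_pred[:3, :3]
        # Clip guards against values slightly outside [-1, 1] due to numerics.
        r_err = np.arccos(np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0))
        return t_err, r_err

    def add_score(T_gt, T_pred, points):
        """ADD: mean distance between corresponding transformed model points."""
        p_gt = points @ T_gt[:3, :3].T + T_gt[:3, 3]
        p_pred = points @ T_pred[:3, :3].T + T_pred[:3, 3]
        return float(np.mean(np.linalg.norm(p_gt - p_pred, axis=1)))

    def add_s_score(T_gt, T_pred, points):
        """ADD-S: symmetric variant using nearest-neighbour distances."""
        p_gt = points @ T_gt[:3, :3].T + T_gt[:3, 3]
        p_pred = points @ T_pred[:3, :3].T + T_pred[:3, 3]
        # Pairwise distances; fine for a sketch, use a KD-tree for large point sets.
        d = np.linalg.norm(p_gt[:, None, :] - p_pred[None, :, :], axis=2)
        return float(np.mean(d.min(axis=1)))

    # Usage with the per-recording model points (assumed shape (N, 3)):
    # points = np.load("object_points.npy")
    # t_err, r_err = pose_errors(T_gt, T_pred)
    # add = add_score(T_gt, T_pred, points)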

Contents:

Due to the large number of files, we provide our dataset as zip files of its two main components: the initial scan (Final_Scan.zip) and the 96 egocentric recordings themselves (Final_Dataset.zip).

Scan:

Output of the 3D Scanner App (https://3dscannerapp.com/). It contains the 3D scene in different formats (mesh.ply, textured_output.obj) as well as per-frame camera information and RGB images. Additionally, we processed the scan with Mask3D to obtain an instance segmentation; its output can be found in the subfolder "pred_mask" and in mesh_labeled.ply.
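
To inspect the scan, the meshes can be opened with any standard 3D library. The following sketch uses open3d purely for illustration (open3d is not a requirement of the dataset), and the paths inside Final_Scan.zip are assumed.

    import open3d as o3d

    # Paths inside Final_Scan.zip are assumed; adjust to the extracted layout.
    mesh = o3d.io.read_triangle_mesh("Final_Scan/mesh.ply")
    mesh_labeled = o3d.io.read_triangle_mesh("Final_Scan/mesh_labeled.ply")

    print(f"scan mesh: {len(mesh.vertices)} vertices, {len(mesh.triangles)} triangles")

    # Visual sanity check of the instance segmentation
    # (assumed to be color-coded per instance in mesh_labeled.ply).
    o3d.visualization.draw_geometries([mesh_labeled])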

Recordings:

This folder contains 96 subfolders, one per egocentric recording, each holding the recording itself, the ground truth trajectory of the carried object, and the corresponding prediction of our method, Lost & Found. Each folder name (folder_name below) indicates the carried object, and each folder contains the following:

    • gt_object.csv - ground truth 6DoF trajectory of the object
    • pred_object.csv - predicted 6DoF trajectory of the object
    • gt_glasses.csv - ground truth 6DoF trajectory of the Aria glasses (used for alignment)
    • pred_glasses.csv - predicted 6DoF trajectory of the Aria glasses (used for alignment)
    • object_points.npy - 3D points of carried object in object coordinate system (ADD(-S) evaluation)
    • folder_name.vrs - Aria output
    • folder_name.vrs.json - Metadata for the Aria output
    • mps_folder_name_vrs - subfolder containing head poses + hand positions from the Aria Machine Perception Services (MPS)
    • icp_aligned_pose.npy - transformation from the iPad scan (Final_Scan) to the respective recording
    • results.json - result metrics of the Lost & Found predictions

For the convenience of the user, all trajectories have been transformed into the coordinate system of each recording, so that they are consistent with the head poses, hand positions, and all other meta information found in the MPS and Aria output. Some content is included purely for convenience; for your own evaluation, only the following files are essential: gt_object.csv, folder_name.vrs, and potentially mps_folder_name_vrs.
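
As a minimal loading sketch for one recording: the exact CSV column layout is not documented here, so pose parsing is left to the evaluation code in the repository, and the array shapes as well as the projectaria_tools calls below are assumptions to be checked against the official documentation.

    import numpy as np
    import pandas as pd

    rec = "Final_Dataset/<folder_name>"  # one of the 96 recording folders (illustrative path)

    # 6DoF trajectories of the carried object; check the actual column layout
    # (timestamps, translation, rotation) against the files and src/evaluation.
    gt_object = pd.read_csv(f"{rec}/gt_object.csv")
    pred_object = pd.read_csv(f"{rec}/pred_object.csv")
    print(gt_object.columns.tolist())

    # Model points for ADD(-S) and the scan-to-recording alignment
    # (assumed to be an (N, 3) array and a 4x4 homogeneous matrix, respectively).
    object_points = np.load(f"{rec}/object_points.npy")
    T_scan_to_rec = np.load(f"{rec}/icp_aligned_pose.npy")

    # The Aria recording itself can be opened with projectaria_tools
    # (assumed API; check the projectaria_tools documentation).
    from projectaria_tools.core import data_provider
    provider = data_provider.create_vrs_data_provider(f"{rec}/<folder_name>.vrs")
    print(provider.get_all_streams())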

Code & Webpage:

Evaluation code can be found in the referenced GitHub repository (specifically the code in src/evaluation). For more details about the project, visit our webpage: https://behretj.github.io/LostAndFound/.

Files (30.6 GB)

    • Final_Dataset.zip - 30.2 GB - md5:14940440b31e3cb957419e73f0099246
    • Final_Scan.zip - 433.8 MB - md5:0819d8f2b6e91f1dff34780086ead802

Additional details

Related works

Is described by
Preprint: arXiv:2411.19162 (arXiv)

Funding

ETH Zurich - ETH AI Center
ETH Zürich Foundation - ETH Foundation Project 2025-FS-352
Swiss National Science Foundation - Advanced Grant 216260
Lamarr Institute for Machine Learning and Artificial Intelligence

Software

Repository URL
https://github.com/behretj/LostFound
Programming language
Python