Published February 27, 2023 | Version v1
Dataset | Open Access

NexusStreets: a dataset combining human and autonomous driving behaviours

  • 1. NEC Laboratories Europe
  • 2. Flyhound

Description

The NexusStreets dataset contains human and autonomous driving scenes, collected by monitoring a target vehicle that is either autonomous or controlled by a human driver. Each scene is provided as:

  • sequences of JPEG images, one image per timestamp
  • target vehicle state information for each timestamp

The dataset was built on the CARLA simulator, using Baidu Apollo for autonomous driving and a Logitech G29 steering wheel for human driving.
The dataset consists of 520 scenes (260 pairs of mirrored scenarios) of 60 seconds each.
The folders are organized as follows:

.
├── ...
├── <driving mode>       
│   ├── <town>      
│   │   ├── <trial> 
│   │   │   └── ...   
│   │   └── ... 
│   └── ...
└── ...
  • driving mode: the control modality of the target vehicle under test, either Baidu Apollo (autonomous) or manual driving;
  • town: one of the five default maps in CARLA (e.g., Town01, Town02, etc.);
  • trial: 60 different trials per map; trials differ in traffic and weather conditions (except Town04). Each trial records 60 seconds of simulation, logging 120 frames per video (i.e., 2 frames per second) and an equal number of rows per CSV. In particular, each trial includes (see the loading sketch after this list):
    • video: a folder containing the JPEG frames;
    • state_features.csv: the state information of the target vehicle for each frame;
    • detection_features.csv: the 2D bounding box detections obtained from a pre-trained YOLOv7 detector.
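For reference, the following is a minimal loading sketch based on the layout above. It assumes the archive is extracted to a local folder named NexusStreets, that frame images use a .jpg extension, and that both CSV files parse with pandas defaults; the actual folder and file names should be checked against the extracted archive.

import glob
import os

import pandas as pd

# Sketch: iterate over the <driving mode>/<town>/<trial> layout described above.
# ROOT, the .jpg extension, and the use of pandas defaults are assumptions.
ROOT = "NexusStreets"

def load_trial(trial_dir):
    """Return (frame_paths, state_df, detections_df) for one trial folder."""
    frame_paths = sorted(glob.glob(os.path.join(trial_dir, "video", "*.jpg")))
    state_df = pd.read_csv(os.path.join(trial_dir, "state_features.csv"))
    detections_df = pd.read_csv(os.path.join(trial_dir, "detection_features.csv"))
    # Each 60-second trial should yield 120 frames and 120 state rows,
    # one per timestamp.
    assert len(frame_paths) == len(state_df)
    return frame_paths, state_df, detections_df

# Walk driving mode -> town -> trial and load each trial.
for mode_dir in sorted(glob.glob(os.path.join(ROOT, "*"))):
    for town_dir in sorted(glob.glob(os.path.join(mode_dir, "*"))):
        for trial_dir in sorted(glob.glob(os.path.join(town_dir, "*"))):
            frames, states, detections = load_trial(trial_dir)
            print(trial_dir, len(frames), "frames")

Since images and CSV rows are indexed by timestamp, frame i of the video folder is expected to correspond to row i of state_features.csv.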

Files

NexusStreets.zip (21.4 GB)
md5:d7506c38fc14396634fa26792f505ada