OpenPack: Public multi-modal dataset for packaging work recognition in logistics domain
- Graduate School of Information Science and Technology, Osaka University
Description
OpenPack is an open-access logistics dataset for human activity recognition. It contains human movement and package information from 16 subjects in four scenarios. The human movement information comprises three types of data: acceleration, physiological signals, and depth sensing. The package information includes the size and the number of items in each packaging job.
In the "Humanware laboratory" at IST Osaka University, with the supervision of industrial engineers, an experiment to mimic logistic center labor was designed. 12 workers with previous packaging experience and 4 without experience performed a set of packaging tasks according to an instruction manual from a real-life logistics center. During the different scenarios, subjects were recorded while performing packing operations using Lidar, Kinect, and Realsense depth sensors while wearing 4 ATR IMU devices and 2 Empatica E4 wearable sensors. Besides sensor data, this dataset contains timestamp information collected from the hand terminal used to register product, packet, and address label codes as well as package details that can be useful to relate operations to specific packages.
The four scenarios are: sequential packing, worker-decided sequence changes, pre-ordered item packing, and packing under time-sensitive stressors. Each subject performed 20 packing jobs in each of 5 work sessions, for a total of 100 packing jobs. More than 53 hours of packaging operations have been labeled with 10 global operation classes and 16 sub-action classes. Sub-action classes are not tied to a single operation, although some appear in only one or two operations.
Information on how to use this dataset is available at https://open-pack.github.io/. For details on how the dataset was collected, please see the publication "OpenPack: A Large-Scale Dataset for Recognizing Packaging Works in IoT-Enabled Logistic Environments" (DOI: 10.1109/PerCom59722.2024.10494448).
Full Dataset
In this repository, data and label files are packaged into a separate archive for each worker. Each worker's archive contains IMU, E4, 2D keypoint, 3D keypoint, annotation, and system-related data.
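To get a first look at how a worker's archive is organized, it can be inspected without extraction using Python's standard zipfile module. A minimal sketch follows; the archive name is a placeholder for whichever worker file you downloaded.

```python
import zipfile

# List the contents of one worker's archive without extracting it.
# "U0101.zip" is a placeholder name; substitute the archive you downloaded.
with zipfile.ZipFile("U0101.zip") as zf:
    for info in zf.infolist():
        print(f"{info.file_size:>12} B  {info.filename}")
```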
Preprocessed Dataset (IMU with Operation and Action Labels)
We have received many comments that it was difficult to combine multiple workers' IMU and annotation data. Therefore, we have created CSV files that combine the sensor data from the four IMUs and the operation and action labels into a single file. These files are included as "imu-with-operation-action-labels.zip".
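As a minimal sketch of how these combined files can be used, the snippet below loads one CSV with pandas and separates sensor columns from label columns. The file path and the label column names ("operation", "action") are assumptions for illustration; check the actual header of the extracted CSVs.

```python
import pandas as pd

# Load one combined IMU + label file extracted from
# imu-with-operation-action-labels.zip. The path below is illustrative.
df = pd.read_csv("imu-with-operation-action-labels/U0101-S0100.csv")

print(df.columns.tolist())  # inspect which columns are sensors vs. labels

# Assumed label column names; adjust to match the actual header.
label_cols = ["operation", "action"]
labels = df[label_cols]
sensors = df.drop(columns=label_cols)  # remaining columns: the four IMUs
print(sensors.shape, labels.shape)
```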
Preprocessed Dataset (Kinect 2D and 3D Keypoint Data with Operation and Action Labels)
We have received several requests for a preprocessed dataset containing only specific types of keypoint data with their assigned operation and action labels. Two new preprocessed files have been added for 2D and 3D keypoint data extracted from the frontal-view Kinect camera (a loading sketch follows the list):
- "kinect-2d-kpt-with-operation-action-labels.zip"
- "kinect-3d-kpt-with-operation-action-labels.zip"
Work is ongoing to update and improve this dataset. When downloading and using it, please verify that you have the latest release. The latest release, v1.1.0, was uploaded on 2024-04-24.
Changelog:
- v1.0.0: Added a tutorial preprocessed dataset for IMU data with operation labels.
- v1.1.0: Updated the preprocessed datasets (added Kinect 2D and 3D keypoint data with operation and action labels).
We hosted an activity recognition competition using this dataset (OpenPack v0.3.x), with awards presented at a PerCom 2023 workshop. The task was simple: recognize the 10 work operations in the OpenPack dataset. Coding materials relevant to this dataset are available at https://open-pack.github.io/challenge2022.
Files (8.7 GB)

| MD5 checksum | Size |
|---|---|
| md5:6e36d3a35dd43cedb167346007a5a2d6 | 524.1 MB |
| md5:f03d864f5a5ec9931ef0376e4f8dbaf9 | 473.6 MB |
| md5:52bcb733aebfadfeef0119f28e70558c | 2.1 GB |
| md5:7c182630795ff0e93393a781a076be6f | 241.0 MB |
| md5:3c32acce47a27437bfb39eed8a0e0199 | 291.1 MB |
| md5:a53e42137495f751806aad42ff458077 | 232.1 MB |
| md5:ab237d082477f754f70ef626be842474 | 212.3 MB |
| md5:7b59285a139d8ce1acedda54a43ef4fd | 229.2 MB |
| md5:0b2300dcf16b62b36fcdd570897bc3d3 | 313.8 MB |
| md5:53b77237e22aa92a0052e5e2ca31193a | 213.4 MB |
| md5:99a053debfd2783d1ec273fec684bfba | 341.7 MB |
| md5:255755b6b92a402030efaa6bff09c67c | 351.2 MB |
| md5:2e73677d0b54a32256646e421025d49b | 208.0 MB |
| md5:48e3b88ff9c17eb77c1600b67d0a4817 | 341.4 MB |
| md5:11cf8ff7b01ecf3b8f2c66f743ac92f0 | 278.9 MB |
| md5:0f6f5f6e06954137f479413d3fcc638f | 283.9 MB |
| md5:79ed0af054f469ee05db6554b6bbbb4f | 283.1 MB |
| md5:b1a4f9f5054bc4d04805b8e0ad80561a | 226.0 MB |
| md5:5040fa7a3179e497b569481e6b8778c6 | 290.8 MB |
| md5:1b787bf3bef02b2cb249aa614beb9a16 | 231.8 MB |
| md5:312b0d54044fc1dff78a34444bd25242 | 226.9 MB |
| md5:031bac6c5719ba6a7a46d1611f5c94e2 | 255.8 MB |
| md5:5f1c635d3c5a313ddfe9c4e6e699af39 | 286.2 MB |
| md5:001f767338eeea57f53afdf29d7f6128 | 246.7 MB |
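To verify that a download completed intact, the MD5 of the local file can be compared against the checksums in the table above. A minimal sketch using Python's standard hashlib (the file name is a placeholder for whichever archive you downloaded):

```python
import hashlib

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a file's MD5 checksum, reading in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the printed digest with the md5 listed above for that file.
print(md5sum("imu-with-operation-action-labels.zip"))
```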
Additional details
Related works
- Is version of:
  - Other: arXiv:2212.11152 (arXiv)
  - Conference paper: 10.1109/PerCom59722.2024.10494448 (DOI)
Dates
- Available: 2024-04-24