Published June 25, 2019 | Version 2
Dataset | Open Access

AndyData-lab-onePerson

  • 1. INRIA Nancy Grand-Est, France
  • 2. Universite de Lorraine, France
  • 3. IMK Automotive, Germany

Description

This dataset contains motion and force measurements of humans performing various manual tasks, as well as annotations of the actions performed and postures adopted by the participants. Thirteen participants performed a series of activities mimicking industrial tasks, such as setting screws at different heights and manipulating loads (15 trials per participant; each trial lasted between 1.5 and 2 min). Participants' whole-body kinematics and hand contact pressure forces were recorded. Whole-body kinematics was captured with both optical (gold standard) and inertial motion capture systems. Hand pressure force was recorded with a prototype glove equipped with pressure sensors. Videos of the participants performing the activities were then annotated by 3 human annotators, who specified the action performed and the posture adopted in each frame of the video. The posture taxonomy follows the Ergonomic Assessment Worksheet (EAWS) postural grid; the action taxonomy defines elementary actions such as reaching, carrying, and picking.

All data files are provided in proprietary format (when one exists), in standard motion-analysis formats, and in CSV format. Annotations are provided in CSV format. Videos of a human avatar replaying the participants' motions are also provided (annotations were performed on these videos).
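Since the annotations are per-frame CSV files, they can be read with standard tooling. The sketch below parses a small inline sample; the column names and label values are illustrative assumptions, not the dataset's actual schema, which is documented in the associated paper.

```python
import csv
import io

# Illustrative per-frame annotation sample; real files pair each video
# frame with an action label and an EAWS posture label. Column names and
# label strings here are assumptions for demonstration only.
sample = (
    "frame,action,posture\n"
    "0,reach,standing\n"
    "1,pick,standing\n"
    "2,carry,walking\n"
)

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)

# One row per video frame; group frames by action label.
actions = [row["action"] for row in rows]
print(actions)  # e.g. ['reach', 'pick', 'carry']
```

The same pattern applies to the motion and force CSV files, with `open(path, newline="")` in place of the inline sample.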

A detailed description of how the data were collected is available in the paper associated with the dataset: "Human Movement and Ergonomics: an Industry-Oriented Dataset for Collaborative Robotics" (Maurice et al., IJRR, in press) https://hal.archives-ouvertes.fr/hal-02289107/document

Notes

Acknowledgement: Part of the equipment used to create the database was funded by the CPER IT2MP of Région Grand-Est, France.

Files (92.5 GB)

glove_csv.zip

md5:b9ba68f36a8467b640b3f54461695e01 — 20.5 MB
md5:81e0df75b69629227ac976873e55c5eb — 4.1 MB
md5:75a8499a5f336d838b1dc8e3916cd960 — 1.6 kB
md5:0afe99e425b2b2b79b0856704529a215 — 1.1 GB
md5:bf0c1171deeb5772271ab15596a5b517 — 1.3 GB
md5:1f956e05f0b3b5cd8c49899b121cfc60 — 6.1 GB
md5:86910892a468ce6f27c1faa559388737 — 3.1 GB
md5:867473b64ae9cbe8c46f8a73145ab9c5 — 3.2 GB
md5:5ea5ca16d6b8016e5248aa4bbbb68c85 — 2.8 GB
md5:00eb56762140de3ec970cfed3431381c — 14.7 GB
md5:6f6dee3f8d8ae1ffc64c63bc199ae398 — 1.8 GB
md5:e56743062aa66701e59a205bbb4f84c0 — 1.4 GB
md5:839814fb3fbae523397d0522bc8194c5 — 15.6 GB
md5:3e18df56cdfc90612ec96b99cfa541da — 41.4 GB

Additional details

Funding

An.Dy – Advancing Anticipatory Behaviors in Dyadic Human-Robot Collaboration (grant no. 731540), European Commission