Benchmark movement data set for trust assessment in human robot collaboration
Description
In the Drapebot project, a worker collaborates with a large industrial manipulator on two tasks: collaborative transport of carbon fibre patches and collaborative draping. To enable data-driven trust assessment, the worker is equipped with a motion tracking suit, and the body movement data is labeled with trust scores from a standard trust questionnaire (Trust Perception Scale - HRI, Schaefer 2016).
Data has been collected in the transport and draping tasks (counterbalanced) from 20 participants, 7 female and 13 male, with an average age of 25 (SD = 4.0) and an average height of 1.74 m (SD = 0.1). One session consists of 24 trials on average per task (transport and draping), resulting in 951 trials across all conditions. For all sessions, body tracking was performed with the Xsens MVN Awinda tracking suit, which consists of a tight-fitting shirt, gloves, a headband, and a series of straps used to attach 17 IMUs to the participant. After calibration, the system uses inverse kinematics to track and log the participant's movements at a rate of 60 Hz. The measurements include the position, orientation, velocity, and acceleration (linear and angular) of every skeleton tracking point (see the Xsens manual for a detailed description of available measurements).
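Since the suit logs at a fixed rate of 60 Hz, frame indices in the exported files map directly to elapsed time. A minimal sketch of that conversion (illustrative only; the frame numbering itself comes from the Xsens export):

```python
# Convert Xsens frame indices to elapsed time, assuming the fixed 60 Hz
# sampling rate stated above (illustrative helper, not part of the data set).
SAMPLE_RATE_HZ = 60

def frame_to_seconds(frame_index: int) -> float:
    """Seconds since the start of the recording for a given frame index."""
    return frame_index / SAMPLE_RATE_HZ

# Example: frame 3600 corresponds to one minute into the recording.
print(frame_to_seconds(3600))  # 60.0
```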
Data organization
There are 20 files per task (transport and draping), one for each of the 20 participants. File names follow the pattern P01SD, where 01 is the participant number and D stands for draping; accordingly, P01ST stands for transport. Each file contains all the data generated by the XSENS motion capture system. The files are xlsx files, and each sheet inside the Excel file holds a different type of data (see the loading sketch after this list):
- Segment Orientation - Quat
- Segment Orientation - Euler
- Segment Position
- Segment Velocity
- Segment Acceleration
- Segment Angular Velocity
- Segment Angular Acceleration
- Joint Angles ZXY
- Joint Angles XZY
- Ergonomic Joint Angles ZXY
- Ergonomic Joint Angles XZY
- Center of Mass
- Sensor Free Acceleration
- Sensor Magnetic Field
- Sensor Orientation - Quat
- Sensor Orientation - Euler
See also: https://base.movella.com/s/article/Output-Parameters-in-MVN-1611927767477?language=en_US
For more information on each specific data type and/or sensor, please see the Xsens manual (link above).
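As a sketch of how a single recording might be loaded, the snippet below lists the sheets of one participant file and reads the segment positions with pandas. The file name P01SD.xlsx follows the naming convention above; the exact sheet labels and column headers should be verified against the actual export.

```python
import pandas as pd

# One participant/task recording (participant 01, draping); the file name
# follows the naming convention above but the exact extension and spelling
# are assumptions.
file_path = "P01SD.xlsx"

# List the available sheets (Segment Position, Segment Velocity, ...).
print(pd.ExcelFile(file_path).sheet_names)

# Read the 3D position of every skeleton segment, one row per frame.
positions = pd.read_excel(file_path, sheet_name="Segment Position")
print(positions.shape)
print(positions.head())
```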
Data Annotation
For each procedure there is an annotation file, sorted_draping.xlsx and sorted_transport.xlsx respectively. In these files, the first column is the frame number, and columns 2 through 21 contain the annotations for each of the 20 participants. The annotations describe the phase of the procedure for each data frame recorded by Xsens (see the alignment sketch after this list):
- Transport phases: pick, transport, drop, return
- Draping phases: approach, draping, return
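A hedged sketch of attaching these phase labels to the movement data, assuming the first annotation column holds the frame index, the remaining columns are ordered by participant, and frames are numbered from zero; adjust the column selection to the actual file layout:

```python
import pandas as pd

# Per-frame phase annotations for the draping task: the first column is
# assumed to hold the frame index, the following 20 columns one phase label
# per participant (column positions are assumptions; check the file).
annotations = pd.read_excel("sorted_draping.xlsx")
frame_col = annotations.columns[0]
p01_col = annotations.columns[1]   # annotations for participant 01

# Movement data for the same participant and task.
positions = pd.read_excel("P01SD.xlsx", sheet_name="Segment Position")
positions["frame"] = range(len(positions))  # assumes 0-based frame numbering

# Attach the phase label (approach, draping, return) to every frame.
merged = positions.merge(
    annotations[[frame_col, p01_col]].rename(
        columns={frame_col: "frame", p01_col: "phase"}
    ),
    on="frame",
    how="left",
)
print(merged["phase"].value_counts())
```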
The file trustscores.xlsx includes some demographic data as well as the results of the trust questionnaire for each participant and each task, including the scores for the individual items and the calculated trust score (a loading sketch follows the column list below). The columns are:
- Subject: participant number for cross-referencing with the annotation and movement data
- Transport.Speed: denoting the robot speed (fast or slow)
- Age: age of the participant
- Gender: gender of the participant
- DominantHand: dominant hand of the participant (left or right)
- Height: height of the participant
- Score columns: the participant's answers (scores) for the questions in each category
This is followed by the trust questionnaire items:
- Which % of the time does the robot:
  - Function successfully
  - Act consistently
  - Communicate with people
  - Provide feedback
  - Malfunction
  - Follow directions
  - Meet the needs of the mission
  - Perform exactly as instructed
  - Have errors
- Which % of the time is the robot:
  - Unresponsive
  - Dependable
  - Reliable
  - Predictable
The last two columns are:
- TrustScore – Final trust score calculated from all questions
- Task – Which task is being performed (Transport/Draping)
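To relate the movement data to the questionnaire results, rows of trustscores.xlsx can be selected by participant number and task. A minimal sketch, assuming the column names Subject, Task, Transport.Speed, and TrustScore listed above (the exact task label "Draping" is an assumption):

```python
import pandas as pd

# Demographic data and questionnaire results, one row per participant and task.
scores = pd.read_excel("trustscores.xlsx")

# Final trust score for participant 01 in the draping task; the task label
# "Draping" is an assumption and should be checked against the file.
mask = (scores["Subject"] == 1) & (scores["Task"] == "Draping")
print(scores.loc[mask, "TrustScore"])

# Average trust score per task and robot speed across all participants.
print(scores.groupby(["Task", "Transport.Speed"])["TrustScore"].mean())
```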
Files (19.4 GB)

| Name | Size | MD5 |
|---|---|---|
| draping.zip | 9.7 GB | cdae329df5e31ad7b8bb5a35314cdcc9 |
| | 2.6 MB | f7e4ae2f322f50c514526552a5bde247 |
| | 2.8 MB | 58316ca32a05f5d88cbb4029f7a18af0 |
| | 9.7 GB | 9adea139f97004a5bb89e41aa41afb59 |
| | 16.9 kB | 20858eb2119d9fed194cf37c6cd2bb95 |
Additional details
Software
- Repository URL: https://github.com/HRI-AAU/DrapebotExample
- Programming language: Jupyter Notebook
- Development Status: Active
References
- Matthias Rehm, Kasper Hald, and Ioannis Pontikis. 2024. Benchmark Movement Data Set for Trust Assessment in Human Robot Collaboration. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI '24). Association for Computing Machinery, New York, NY, USA, 934–938. https://doi.org/10.1145/3610977.3637472