1471975
doi
10.5281/zenodo.1471975
oai:zenodo.org:1471975
user-eu
Maurice, Pauline
INRIA Nancy Grand-Est, France
Malaisé, Adrien
INRIA Nancy Grand-Est, France
Amiot, Clélie
Université de Lorraine, France
Paris, Nicolas
Université de Lorraine, France
Richard, Guy-Junior
Université de Lorraine, France
Ivaldi, Serena
INRIA Nancy Grand-Est, France
Rochel, Olivier
INRIA Nancy Grand-Est, France
Fritzsche, Lars
IMK Automotive, Germany
AndyData-lab-onePerson
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
Human motion measurements
Work-related musculoskeletal disorders
Posture annotation
Action annotation
Activity recognition
Ergonomics
Ergonomic Assessment Worksheet
Industrial activities
<p>This dataset contains motion and force measurements of humans performing various manual tasks, together with annotations of the actions performed and the postures adopted by the participants. 13 participants performed a series of activities mimicking industrial tasks, such as screwing at different heights and manipulating loads (15 trials per participant; one trial lasts between 1.5 and 2 min). Participants' whole-body kinematics and hand contact pressure forces were recorded. Whole-body kinematics was recorded with both optical (gold standard) and inertial motion capture systems. Hand pressure force was recorded with a prototype glove equipped with pressure sensors. Videos of the participants performing the activities were then annotated by 3 human annotators, who specified the action performed and the posture adopted in each frame of the video. The posture taxonomy follows the Ergonomic Assessment Worksheet (EAWS) postural grid. The action taxonomy defines elementary actions such as reaching, carrying, and picking.</p>
<p>All data files are provided in proprietary format (where one exists), in standard motion-analysis formats, and in CSV format. Annotations are provided in CSV format. Videos of a human avatar replaying the participants' motions are also provided (the annotations were performed on these videos).</p>
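Since the annotations are per-frame CSV files, a minimal sketch of tallying labels with only the Python standard library is shown below. The column name passed in (e.g. `action`) is hypothetical; the actual CSV schema is described in the documentation archive (doc.zip).

```python
import csv
from collections import Counter

def label_counts(path, column):
    """Count how often each label occurs in one annotation column.

    The column name is a hypothetical example; consult doc.zip in this
    record for the real annotation schema.
    """
    with open(path, newline="") as f:
        return Counter(row[column] for row in csv.DictReader(f))
```

This gives a quick distribution of actions or postures per trial, which can be useful for sanity-checking a download before training an activity-recognition model.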
Acknowledgement: Part of the equipment used to create the database was funded by the CPER IT2MP of Région Grand-Est, France.
Zenodo
2018-10-26
info:eu-repo/semantics/other
1471974
user-eu
1
award_title=Advancing Anticipatory Behaviors in Dyadic Human-Robot Collaboration; award_number=731540; award_identifiers_scheme=url; award_identifiers_identifier=https://cordis.europa.eu/projects/731540; funder_id=00k4n6c32; funder_name=European Commission;
1641813453.422001
2723211816
md5:1c0e0c250ef5e65b8b6df76edbe55fe4
https://zenodo.org/records/1471975/files/videos.zip
77119751337
md5:ab01a9e8dcca967228268bca10c5cdea
https://zenodo.org/records/1471975/files/xsens.zip
3639196
md5:6144811223cfd7ae9ab2904060de52af
https://zenodo.org/records/1471975/files/annotations.zip
7730216685
md5:c83163cc2b38405ac77e28390adf7ad7
https://zenodo.org/records/1471975/files/qualisys.zip
18084826
md5:6f2da057e8a37c89c5acabfd3fc093f9
https://zenodo.org/records/1471975/files/glove.zip
1169634
md5:e5cacb27ffbdef142c376d6d58705178
https://zenodo.org/records/1471975/files/doc.zip
1109
md5:55730e0c3fe2ae772efb2f7504e39275
https://zenodo.org/records/1471975/files/participants.zip
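Each archive above is listed with an md5 checksum. A minimal sketch (standard-library Python only) for verifying a downloaded archive against its listed checksum, streaming in chunks so even the largest file (xsens.zip, ~77 GB) never has to fit in memory:

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Compute the md5 hex digest of a file, reading it in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected):
    """Compare a file's digest with a checksum string as listed in this
    record, e.g. 'md5:1c0e0c250ef5e65b8b6df76edbe55fe4' for videos.zip."""
    return md5_of(path) == expected.removeprefix("md5:")
```

For example, `verify("videos.zip", "md5:1c0e0c250ef5e65b8b6df76edbe55fe4")` should return `True` for an intact download of the first archive.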
public
10.5281/zenodo.1471974
isVersionOf
doi