Published July 23, 2018 | Version 1.0-alpha
Dataset (Open Access)

UC2017 Static and Dynamic Hand Gestures

  • 1. University of Coimbra
  • 2. École Nationale d'Arts et Métiers

Description

We introduce the UC2017 static and dynamic hand gesture dataset. Most researchers use vision-based systems such as the Microsoft Kinect to acquire and classify hand gesture data. Nevertheless, we believe that wearable systems can achieve more reliable results and allow the use of more complex gestures. Few such datasets exist, owing to the variety of data gloves on the market and their relatively high cost. For these reasons, we opted to create a new dataset to present and evaluate our gesture recognition framework. The objectives of the dataset are: (1) to provide a superset of hand gestures for human-robot interaction (HRI), (2) to have user variability, and (3) to be representative of the actual gestures performed in a real-world interaction.

We divide the dataset into two types of gestures: static gestures (SG) and dynamic gestures (DG). An SG is described by a single timestep of data, representing a single hand pose and orientation. DGs are variable-length time series of poses and orientations with particular meanings. Some gestures in the dataset are associated with a specific meaning in the context of HRI, while others are arbitrary, enriching the dataset and adding complexity to the classification problem.

The library is composed of 24 SG classes and 10 DG classes. The dataset includes SG data from eight subjects, with a total of 100 repetitions of each of the 24 classes (2400 samples in total). The DG samples were obtained from six subjects, with a cumulative 131 repetitions of each class (1310 samples in total). All of the subjects are right-handed and performed the gestures with their left hand.

We used a data glove (CyberGlove II) and a magnetic tracker (Polhemus Liberty) to capture hand shape, position, and orientation over time. The glove provides digital signals proportional to the bending angle at each of its 22 sensors, which are elastically attached to a subset of the hand's joints, giving an approximation of the hand's shape. The tracker's sensor is rigidly attached to the glove at the wrist and measures its position and orientation with respect to a ground-fixed frame. The orientation is the rotation between the fixed frame and the sensor frame, given as a quaternion (w, x, y, z). Since the sensors have slightly different acquisition rates (100 Hz for the glove and 120 Hz for the tracker), we fuse the sensor data online: the tracker stream is downsampled by keeping only the tracker frame closest in time to each glove frame.
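As a minimal sketch of this nearest-in-time matching (the function and variable names are our own illustration, not the dataset's released code):

```python
import numpy as np

def fuse_nearest(glove_t, glove_data, tracker_t, tracker_data):
    """Align tracker frames to glove frames by nearest timestamp.

    glove_t:      (N,) glove timestamps at ~100 Hz
    glove_data:   (N, 22) bend-sensor readings
    tracker_t:    (M,) tracker timestamps at ~120 Hz
    tracker_data: (M, 7) position (x, y, z) + quaternion (w, x, y, z)
    Returns an (N, 29) array of fused frames at the glove rate.
    """
    # For each glove timestamp, find the insertion point in the tracker stream,
    # then pick whichever neighboring tracker frame is closer in time.
    idx = np.searchsorted(tracker_t, glove_t)
    idx = np.clip(idx, 1, len(tracker_t) - 1)
    left, right = tracker_t[idx - 1], tracker_t[idx]
    idx -= (glove_t - left) < (right - glove_t)  # step back when the left frame is closer
    return np.hstack([glove_data, tracker_data[idx]])
```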

The files are in the HDF5 format. The array dimensions are (sample, time, variables).
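A minimal sketch of loading the data with h5py; the file name and dataset key below are placeholders, since the actual names are not listed here and can be inspected with `f.keys()`:

```python
import h5py

# "UC2017_DG.h5" and "gestures" are hypothetical names used for illustration.
with h5py.File("UC2017_DG.h5", "r") as f:
    print(list(f.keys()))        # inspect the datasets stored in the file
    dg = f["gestures"][:]        # array shaped (sample, time, variables)
    print(dg.shape)              # variable-length DGs presumably padded along the time axis
```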

Files

Files (39.7 MB)

md5:252bb865963e1b92dbc1b25ea7f93f42 (39.0 MB)
md5:f440378225253025909a66cb1ea98d61 (667.1 kB)

Additional details

Funding

ColRobot – Collaborative Robotics for Assembly and Kitting in Smart Manufacturing (grant agreement 688807), funded by the European Commission