Published November 29, 2023 | Version 1.0.1

EMG from Combination Gestures following Visual Prompts

Description

Dataset of surface EMG recordings from 10 subjects performing single and combination gestures, accompanying the paper "Fast and Expressive Gesture Recognition using a Combination-Homomorphic Electromyogram Encoder".

Each subject contributes 1224 gesture examples (584 single gestures and 640 combination gestures).

For more details and example usage, see the preprint and code repository linked under Related works below.

Contents


Data

Each gesture trial consists of 500 ms of activity recorded from 8 electrodes placed around the mid-forearm.

Recording was performed at 1926 Hz with built-in 20–450 Hz bandpass filtering applied.
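
As a quick sanity check, the trial length can be related to the sampling rate (a minimal Python sketch; the path is a placeholder, following the file layout shown under "Loading data" below):

import numpy as np

fs = 1926  # sampling rate (Hz)
data = np.load("path/to/combination-gesture-dataset/python/subj0/data.npy")
n_trials, n_channels, n_timesteps = data.shape
print(n_channels)               # 8 recording electrodes
print(1000 * n_timesteps / fs)  # ~500 ms per trial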

Data Collection Details

In each gesture trial, subjects began from a neutral position, with the arm on an armrest. Subjects followed visual cues to perform either a direction gesture, a modifier gesture, or both. The timing structure and number of gestures obtained from each visual prompt varied across experimental blocks.

Data for a single gesture trial come from the central 500 ms of the gesture (the window is centered so that it lies as far from gesture onset and gesture end as possible).
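
The centering logic can be sketched as follows (illustrative only; `recording`, `onset_idx`, and `end_idx` are hypothetical names for a full-length recording and its gesture boundaries, and are not part of this dataset):

fs = 1926              # sampling rate (Hz)
win = round(0.5 * fs)  # 500 ms window, in samples

def center_window(recording, onset_idx, end_idx):
    # Place the window midway between gesture onset and end, so it lies
    # as far from both boundaries as possible.
    center = (onset_idx + end_idx) // 2
    start = center - win // 2
    return recording[:, start:start + win]  # shape: (channels, win)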

See the paper for more details.

Labels

Each gesture trial in the dataset has a two-part label, describing which gesture was performed.

The first label component describes the direction gesture, and takes values in {0, 1, 2, 3, 4}, with the following meaning:

  • 0 - "Up" (wrist extension)
  • 1 - "Down" (wrist flexion)
  • 2 - "Left" (wrist abduction; movement towards thumb side)
  • 3 - "Right" (wrist adduction; movement towards pinky side)
  • 4 - "NoDirection" (absence of a direction gesture; none of the above)

The second label component describes the modifier gesture, and takes values in {0, 1, 2, 3, 4}, with the following meaning:

  • 0 - "Pinch" (connecting thumb and index finger)
  • 1 - "Thumb" (touching thumb to first knuckle of index finger)
  • 2 - "Fist" (all fingers curled)
  • 3 - "Open" (all fingers extended)
  • 4 - "NoModifier" (absence of a modifier gesture; none of the above)

Each subject provided 1224 total examples, consisting of:

  • 584 single gesture examples (73 examples for each of the 4 + 4 = 8 single gesture classes)
  • 640 combination gesture examples (40 examples for each of the 4 × 4 = 16 combination gesture classes)
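
These counts can be verified programmatically after loading (a small sketch; assumes one subject's labels loaded as in the Python example under "Loading data" below):

import numpy as np

labels = np.load("path/to/combination-gesture-dataset/python/subj0/labels.npy")
pairs, counts = np.unique(labels, axis=0, return_counts=True)
# Expect 24 label pairs: 8 single gestures (one component equal to 4)
# with 73 trials each, and 16 combinations with 40 trials each.
for pair, count in zip(pairs, counts):
    print(pair, count)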
     

Examples of Label Structure

Single gestures have labels like (0, 4) indicating ("Up", "NoModifier") or (4, 3) indicating ("NoDirection", "Open").

Combination gestures have labels like (0, 0) indicating ("Up", "Pinch") or (2, 3) indicating ("Left", "Open").
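
Single and combination trials can therefore be separated directly from the labels (a minimal sketch; assumes `all_data` and `all_labels` built as in the Python example under "Loading data" below):

# A trial is a single gesture iff one of its label components is a
# "No..." value (4); in a combination, both components are active.
is_single = (all_labels[:, 0] == 4) | (all_labels[:, 1] == 4)
single_trials = all_data[is_single]   # 5840 trials (584 x 10 subjects)
combo_trials = all_data[~is_single]   # 6400 trials (640 x 10 subjects)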

Loading data

After downloading and unzipping the dataset, use the examples below to load the data in Python or MATLAB.

Data have shape `(items, channels, timesteps)`.

Labels have shape `(items, 2)`, where the first coordinate is the direction and the second coordinate is the modifier.
 

Python example

# Load data
import numpy as np
from pathlib import Path

folder = Path("path/to/combination-gesture-dataset/python")

all_data, all_labels = [], []
for i in range(10):
    data = np.load(f"{folder}/subj{i}/data.npy")
    labels = np.load(f"{folder}/subj{i}/labels.npy")
    all_data.append(data)
    all_labels.append(labels)
all_data = np.concatenate(all_data, axis=0)
all_labels = np.concatenate(all_labels, axis=0)

print(all_data.shape)    # (12240, 8, 962)
print(all_labels.shape)  # (12240, 2)

# Convert numeric labels to text
def convert_label(label):
    directions = ["Up", "Down", "Left", "Right", "NoDirection"]
    modifiers = ["Pinch", "Thumb", "Fist", "Open", "NoModifier"]
    return directions[label[0]], modifiers[label[1]]

print(convert_label(all_labels[0]))  # ('NoDirection', 'Fist')
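
Since the per-subject arrays are concatenated in order, a matching subject index can be reconstructed if needed (a small sketch; relies on the 1224 trials per subject noted above):

subject_ids = np.repeat(np.arange(10), 1224)  # shape: (12240,)
subj3_data = all_data[subject_ids == 3]       # all trials from subject 3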

 

MATLAB example

% Load data
folder = "path/to/combination-gesture-dataset/matlab";

all_data = [];
all_labels = [];
for i = 0:9
    data = load(sprintf("%s/subj%d/data.mat", folder, i));
    labels = load(sprintf("%s/subj%d/labels.mat", folder, i));
    all_data = [all_data; data.contents];
    all_labels = [all_labels; labels.contents];
end

% (Optional) convert labels from 0-indexed to 1-indexed, for easier conversion to text
all_labels = all_labels + 1;

size(all_data)   % (12240, 8, 962)
size(all_labels) % (12240, 2)

% Convert numeric labels to text (assumes the 1-indexing step above was applied)
directions = ["Up", "Down", "Left", "Right", "NoDirection"];
modifiers = ["Pinch", "Thumb", "Fist", "Open", "NoModifier"];
convert_label = @(label) [directions(label(1)), modifiers(label(2))];

convert_label(all_labels(1, :)) % "NoDirection" "Fist"

Files (336.7 MB)

  • combination-gesture-dataset.zip - 336.7 MB (md5:ed68bfa926971e5472d2dc110c8905b9)
  • 4.6 kB file (md5:987ed52e12cd8e7007ee86fded4197a7)

Additional details

Related works

  • Is published in: Preprint, https://arxiv.org/pdf/2311.14675.pdf
  • Is supplement to: Software, https://github.com/nik-sm/com-hom-emg