
Published July 14, 2020 | Version v1
Dataset (Open Access)

The Greek Sign Language (GSL) Dataset

Description

Abstract

The Greek Sign Language (GSL) dataset is a large-scale RGB+D corpus suitable for Sign Language Recognition (SLR) and Sign Language Translation (SLT). The videos are captured with an Intel RealSense D435 RGB+D camera at a rate of 30 fps, and both the RGB and the depth streams are acquired at the same spatial resolution of 848×480 pixels. To increase variability in the videos, the camera position and orientation are slightly altered between consecutive recordings. Seven different signers perform five individual, commonly encountered scenarios set in different public services. The average length of each scenario is twenty sentences.
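
The capture software is not part of this record; purely as an illustrative sketch, a stream configuration matching the setup described above (848×480 RGB and depth at 30 fps on a RealSense D435) could be expressed with Intel's pyrealsense2 SDK as follows. This is an assumption about tooling, not the authors' actual capture code.

import pyrealsense2 as rs

# Illustrative only: request synchronized RGB and depth streams at the
# resolution and frame rate reported for the dataset (848x480 @ 30 fps).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 848, 480, rs.format.bgr8, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()      # one synchronized frame set
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
finally:
    pipeline.stop()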

Description

The dataset contains 10,290 sentence instances, 40,785 gloss instances, 310 unique glosses (vocabulary size) and 331 unique sentences, with 4.23 glosses per sentence on average. Each signer performs the pre-defined dialogues five consecutive times. In all cases, the simulated setting is a deaf person communicating with a single public service employee, and the signer performs the gloss sequences of both agents in the discussion. GSL linguistic experts annotated each gloss sequence; the annotations are given at both the individual-gloss and the gloss-sequence level. A translation of the gloss sentences into spoken Greek is also provided.
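
The exact layout of the annotation files is defined in supplementary.zip. Purely as a sketch, under the assumption of a tab-separated table with one row per sentence instance and a space-separated glosses column (a hypothetical format, with hypothetical file and column names), the headline statistics above could be recomputed as follows.

import pandas as pd

# Hypothetical file name and column names; the real annotation format is
# defined by the files shipped in supplementary.zip.
ann = pd.read_csv("annotations.tsv", sep="\t")

gloss_lists = ann["glosses"].str.split()   # one list of glosses per sentence
counts = gloss_lists.apply(len)

print("sentence instances :", len(ann))
print("gloss instances    :", int(counts.sum()))
print("unique glosses     :", len({g for gl in gloss_lists for g in gl}))
print("glosses / sentence :", round(counts.mean(), 2))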

 

Evaluation

The GSL dataset includes three evaluation setups (a small sketch for checking the split sizes follows the list):

  • a) Signer-dependent continuous sign language recognition (GSL SD) – roughly 80% of the videos are used for training, corresponding to 8,189 instances. The remaining 1,063 instances (10%) are kept for validation and 1,043 (10%) for testing.
  • b) Signer-independent continuous sign language recognition (GSL SI) – the selected test gloss sequences do not appear in the training set, while all of the individual glosses do. In GSL SI, the recordings of one signer are held out for validation and testing (588 and 881 instances, respectively), and the remaining 8,821 instances are used for training.
  • c) Isolated gloss sign language recognition (GSL isol.) – the validation set consists of 2,231 gloss instances and the test set of 3,500, while the remaining 34,995 are used for training. All 310 unique glosses appear in the training set.
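
The split definitions themselves are distributed inside supplementary.zip. The file names below are placeholders rather than the actual ones; the snippet only sketches how the per-setup counts listed above could be verified once the split files are extracted.

import pandas as pd

# Placeholder file names; the actual split files and their format are
# provided in supplementary.zip.
splits = {name: pd.read_csv(f"GSL_SD_{name}.csv") for name in ("train", "val", "test")}

total = sum(len(df) for df in splits.values())
for name, df in splits.items():
    print(f"GSL SD {name}: {len(df)} instances ({100 * len(df) / total:.1f}%)")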


Each zip file contains the videos for one scenario. Zip files whose names include "Depth" contain the depth images of each video. The supplementary.zip file contains the annotation files for the videos and the evaluation splits.
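
As an illustration of the layout described above, one scenario and its depth archive could be unpacked and paired roughly as below. The archive names, video container format, and per-video depth folders are assumptions made for the example, not guaranteed by the record.

from pathlib import Path
import zipfile

# Assumed local archive names for one scenario, its depth images, and the annotations.
for archive in ("health1.zip", "health1_Depth.zip", "supplementary.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(Path(archive).stem)

# Assumed: RGB videos stored as *.mp4 files, depth images grouped in one folder per video.
rgb_videos = sorted(Path("health1").rglob("*.mp4"))
depth_dirs = sorted(p for p in Path("health1_Depth").rglob("*") if p.is_dir())
print(len(rgb_videos), "RGB videos,", len(depth_dirs), "depth image folders")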

Citation

If you use our dataset, please cite our work:

@misc{adaloglou2021comprehensive,
      title={A Comprehensive Study on Deep Learning-based Methods for Sign Language Recognition},
      author={Nikolas Adaloglou and Theocharis Chatzis and Ilias Papastratis and Andreas Stergioulas and Georgios Th. Papadopoulos and Vassia Zacharopoulou and George J. Xydopoulos and Klimnis Atzakas and Dimitris Papazachariou and Petros Daras},
      year={2021},
      eprint={2007.12530},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

 

@ARTICLE{9393618,
  author={N. M. {Adaloglou} and T. {Chatzis} and I. {Papastratis} and A. {Stergioulas} and G. T. {Papadopoulos} and V. {Zacharopoulou} and G. {Xydopoulos} and K. {Antzakas} and D. {Papazachariou} and P. {Daras}},
  journal={IEEE Transactions on Multimedia}, 
  title={A Comprehensive Study on Deep Learning-based Methods for Sign Language Recognition}, 
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TMM.2021.3070438}}

Notes

Acknowledgments

This work was supported by the Greek General Secretariat of Research and Technology under contract Τ1ΕΔΚ-02469 EPIKOINONO. The authors would like to sincerely thank their collaborators, Prof. Dimitris Papazachariou, Prof. Klimnis Atzakas, George J. Xydopoulos, and Vassia Zacharopoulou, from the Department of Philology of the University of Patras, who provided meaningful insights and expertise that greatly assisted this research. We would also like to express our gratitude to the Greek Sign Language Center for their valuable feedback and contribution to the Greek sign language data capture.

Files

health1.zip

Files (134.8 GB)

MD5 checksum (size)
md5:2be4fccd3d1604aa86fe766cbbb6816f (1.6 GB)
md5:1dfeae5c64433eabb075f97b2c51223b (4.1 GB)
md5:94350163ff6970c65e248a3cb3c24bc3 (2.3 GB)
md5:7ca2d437ce76c365e0cac6a929a61819 (6.3 GB)
md5:bcd5079ffe6d85737754dad268a5ef57 (1.4 GB)
md5:cf3a174826761dcfcfd3d4c13f6c2a10 (3.7 GB)
md5:fa1ba531dca26e869751fc928bd94cb0 (3.6 GB)
md5:b1d05f99ffcccfa3115cae2d0d22e428 (5.1 GB)
md5:1aae788e20c3d18056edf1244c61739b (3.0 GB)
md5:b685eb0becdd26449b00b4faa7008f11 (3.6 GB)
md5:ae6f2fe5bbb475a680e00fddf781084d (2.8 GB)
md5:e580d834fbe647bfec2779c360170890 (7.3 GB)
md5:d1aa6f2e69767400256d1686b5cc1929 (3.8 GB)
md5:2a7a19df39f27c159344343bdea990a3 (9.5 GB)
md5:97cd83ef035876263d084d57217deb0d (2.5 GB)
md5:69a5a485a413be988bff458942c26d62 (6.2 GB)
md5:b5c55ac0320370405de588018f6c6e9c (2.9 GB)
md5:49f0891d915a31c26ca337389ad240e4 (7.9 GB)
md5:716c5d62a82f14fe76aee21911b48118 (5.4 GB)
md5:120eb04029a2fc1ed4974e2b37d6d876 (13.8 GB)
md5:fba5de41fb1e58945ff2357a94b78504 (2.0 GB)
md5:3824c8ada627ae9838ae156093b23d80 (4.6 GB)
md5:45631c95300a6389a14dc45d50ed7578 (2.5 GB)
md5:c2c07920622fb6da66e5914ebec5bb49 (6.3 GB)
md5:0b89a1e0c84e80cb46452824f0e881cb (1.7 GB)
md5:e0dcb68adcddcad6d99394be87876ce7 (4.5 GB)
md5:712b10dfae5e39854534d19cb035479e (2.4 GB)
md5:a3322486f3bda99354721ad4fb17e161 (5.9 GB)
md5:44614c977e369647267f0f08a8b748c1 (2.2 GB)
md5:a9885e13ec28275aea8f6946bd908307 (5.9 GB)
md5:3383b3785dcde4e0f255c6d9b5a42e0f (301.5 kB)

Additional details

Related works

Is cited by
Preprint: https://arxiv.org/abs/2007.12530 (URL)
Journal article: 10.1109/TMM.2021.3070438 (DOI)
Is supplement to
Journal article: 10.1109/ACCESS.2020.2993650 (DOI)