3941811
doi
10.5281/zenodo.3941811
oai:zenodo.org:3941811
Nikolas Adaloglou
Visual Computing Lab, Information Technologies Institute
Andreas Stergioulas
Visual Computing Lab, Information Technologies Institute
Theocharis Chatzis
Visual Computing Lab, Information Technologies Institute
Georgios Th. Papadopoulos
Visual Computing Lab, Information Technologies Institute
Vassia Zacharopoulou
University of Patras
George J. Xydopoulos
University of Patras
Klimnis Atzakas
University of Patras
Dimitris Papazachariou
University of Patras
Kosmas Dimitropoulos
Visual Computing Lab, Information Technologies Institute
Petros Daras
Visual Computing Lab, Information Technologies Institute
The Greek Sign Language (GSL) Dataset
Ilias Papastratis
Visual Computing Lab, Information Technologies Institute
doi:10.1109/ACCESS.2020.2993650
url:https://arxiv.org/abs/2007.12530
doi:10.1109/TMM.2021.3070438
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
sign language recognition
sign language
deep learning
computer vision
dataset
<p><strong>Abstract</strong></p>
<p>The Greek Sign Language (GSL) dataset is a large-scale RGB+D dataset, suitable for Sign Language Recognition (SLR) and Sign Language Translation (SLT). The videos are captured using an Intel RealSense D435 RGB+D camera at a rate of 30 fps, and both the RGB and depth streams are acquired at the same spatial resolution of 848×480 pixels. To increase variability in the videos, the camera position and orientation are slightly altered between subsequent recordings. Seven different signers perform five individual scenarios commonly encountered in different public services. The average length of each scenario is twenty sentences.</p>
<p><strong>Description</strong></p>
<p>The dataset contains 10,290 sentence instances, 40,785 gloss instances, 310 unique glosses (vocabulary size) and 331 unique sentences, with 4.23 glosses per sentence on average. Each signer is asked to perform the pre-defined dialogues five consecutive times. In all cases, the simulation considers a deaf person communicating with a single public service employee; the signer performs the gloss sequences of both agents in the discussion. Each gloss sequence is annotated by GSL linguistic experts, and annotations are given at both the individual-gloss and the gloss-sequence level. A translation of the gloss sentences into spoken Greek is also provided.</p>
<p> </p>
<p><strong>Evaluation</strong></p>
<p>The GSL dataset includes three evaluation setups:</p>
<ul>
<li>a) Signer-dependent continuous sign language recognition (GSL SD) – roughly 80% of the videos (8,189 instances) are used for training, while 1,063 (10%) are kept for validation and 1,043 (10%) for testing.</li>
<li>b) Signer-independent continuous sign language recognition (GSL SI) – the selected test gloss sequences are not used in the training set, although all individual glosses appear in it. In GSL SI, the recordings of one signer are left out for validation and testing (588 and 881 instances, respectively); the remaining 8,821 instances are used for training.</li>
<li>c) Isolated gloss sign language recognition (GSL isol.) – the validation set consists of 2,231 gloss instances and the test set of 3,500, while the remaining 34,995 are used for training. All 310 unique glosses are seen in the training set.</li>
</ul>
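<p>As a quick sanity check, the signer-independent split sizes above can be verified against the total number of sentence instances. A minimal sketch (the numbers are copied from this description; the dictionary name is illustrative only):</p>
<pre>
# Split sizes for the signer-independent setup (GSL SI),
# as stated in this record's description.
TOTAL_SENTENCE_INSTANCES = 10290

gsl_si = {"train": 8821, "val": 588, "test": 881}

# The three splits partition all sentence instances.
assert sum(gsl_si.values()) == TOTAL_SENTENCE_INSTANCES

# Fraction of instances used for training (roughly 86%).
train_fraction = gsl_si["train"] / TOTAL_SENTENCE_INSTANCES
</pre>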
<p>Each zip file contains the videos for one scenario; zip files with the _Depth suffix contain the corresponding depth images. The supplementary.zip file contains the annotation files for the videos and the evaluation splits.</p>
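<p>After downloading, the archives can be verified against the MD5 checksums listed in this record. A minimal sketch in Python using the standard library (the file name and checksum below are taken from this record's file list; adjust the path to where the archive was saved):</p>
<pre>
import hashlib

def md5_of_file(path, chunk_size=1 &lt;&lt; 20):
    """Compute the MD5 checksum of a file, reading it in chunks
    so multi-GB dataset archives need not fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Checksum for kep4.zip as listed in this record's file list.
expected = "b5c55ac0320370405de588018f6c6e9c"
# assert md5_of_file("kep4.zip") == expected
</pre>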
<p><strong>Citation</strong></p>
<p>If you use our dataset, please cite our work:</p>
<pre>@misc{adaloglou2021comprehensive,
title={A Comprehensive Study on Deep Learning-based Methods for Sign Language Recognition},
author={Nikolas Adaloglou and Theocharis Chatzis and Ilias Papastratis and Andreas Stergioulas and Georgios Th. Papadopoulos and Vassia Zacharopoulou and George J. Xydopoulos and Klimnis Atzakas and Dimitris Papazachariou and Petros Daras},
year={2021},
eprint={2007.12530},
archivePrefix={arXiv},
primaryClass={cs.CV}
}</pre>
<p> </p>
<pre>@ARTICLE{9393618,
author={N. M. {Adaloglou} and T. {Chatzis} and I. {Papastratis} and A. {Stergioulas} and G. T. {Papadopoulos} and V. {Zacharopoulou} and G. {Xydopoulos} and K. {Antzakas} and D. {Papazachariou} and P. {Daras}},
journal={IEEE Transactions on Multimedia},
title={A Comprehensive Study on Deep Learning-based Methods for Sign Language Recognition},
year={2021},
volume={},
number={},
pages={1-1},
doi={10.1109/TMM.2021.3070438}}
</pre>
Acknowledgments
This work was supported by the Greek General Secretariat of Research and Technology under contract Τ1ΕΔΚ-02469 EPIKOINONO.
The authors would like to sincerely thank their collaborators, Prof. Dimitris Papazachariou, Prof. Klimnis Atzakas, George J. Xydopoulos, and Vassia Zacharopoulou, from the Department of Philology of the University of Patras, who provided meaningful insights and expertise that greatly assisted this research. We would also like to express our gratitude to the Greek Sign Language Center for their valuable feedback and contribution to the Greek sign language data capture.
Zenodo
2020-07-14
info:eu-repo/semantics/other
3941810
1620895637.051524
2928137290
md5:b5c55ac0320370405de588018f6c6e9c
https://zenodo.org/records/3941811/files/kep4.zip
2251986261
md5:94350163ff6970c65e248a3cb3c24bc3
https://zenodo.org/records/3941811/files/health2.zip
3707399305
md5:cf3a174826761dcfcfd3d4c13f6c2a10
https://zenodo.org/records/3941811/files/health3_Depth.zip
1383048020
md5:bcd5079ffe6d85737754dad268a5ef57
https://zenodo.org/records/3941811/files/health3.zip
5123616768
md5:b1d05f99ffcccfa3115cae2d0d22e428
https://zenodo.org/records/3941811/files/health4_Depth.zip
6295843062
md5:7ca2d437ce76c365e0cac6a929a61819
https://zenodo.org/records/3941811/files/health2_Depth.zip
1588582341
md5:2be4fccd3d1604aa86fe766cbbb6816f
https://zenodo.org/records/3941811/files/health1.zip
3570954192
md5:fa1ba531dca26e869751fc928bd94cb0
https://zenodo.org/records/3941811/files/health4.zip
4056510700
md5:1dfeae5c64433eabb075f97b2c51223b
https://zenodo.org/records/3941811/files/health1_Depth.zip
3556769792
md5:b685eb0becdd26449b00b4faa7008f11
https://zenodo.org/records/3941811/files/health5_Depth.zip
2977042758
md5:1aae788e20c3d18056edf1244c61739b
https://zenodo.org/records/3941811/files/health5.zip
7285418895
md5:e580d834fbe647bfec2779c360170890
https://zenodo.org/records/3941811/files/kep1_Depth.zip
2830292812
md5:ae6f2fe5bbb475a680e00fddf781084d
https://zenodo.org/records/3941811/files/kep1.zip
9541364298
md5:2a7a19df39f27c159344343bdea990a3
https://zenodo.org/records/3941811/files/kep2_Depth.zip
3800915100
md5:d1aa6f2e69767400256d1686b5cc1929
https://zenodo.org/records/3941811/files/kep2.zip
6204477651
md5:69a5a485a413be988bff458942c26d62
https://zenodo.org/records/3941811/files/kep3_Depth.zip
2509750892
md5:97cd83ef035876263d084d57217deb0d
https://zenodo.org/records/3941811/files/kep3.zip
7934551778
md5:49f0891d915a31c26ca337389ad240e4
https://zenodo.org/records/3941811/files/kep4_Depth.zip
13796411139
md5:120eb04029a2fc1ed4974e2b37d6d876
https://zenodo.org/records/3941811/files/kep5_Depth.zip
5378716117
md5:716c5d62a82f14fe76aee21911b48118
https://zenodo.org/records/3941811/files/kep5.zip
2445367561
md5:712b10dfae5e39854534d19cb035479e
https://zenodo.org/records/3941811/files/police4.zip
5881402882
md5:a9885e13ec28275aea8f6946bd908307
https://zenodo.org/records/3941811/files/police5_Depth.zip
4594641045
md5:3824c8ada627ae9838ae156093b23d80
https://zenodo.org/records/3941811/files/police1_Depth.zip
2193971845
md5:44614c977e369647267f0f08a8b748c1
https://zenodo.org/records/3941811/files/police5.zip
301493
md5:3383b3785dcde4e0f255c6d9b5a42e0f
https://zenodo.org/records/3941811/files/supplementary.zip
4509150550
md5:e0dcb68adcddcad6d99394be87876ce7
https://zenodo.org/records/3941811/files/police3_Depth.zip
2533723307
md5:45631c95300a6389a14dc45d50ed7578
https://zenodo.org/records/3941811/files/police2.zip
6310422981
md5:c2c07920622fb6da66e5914ebec5bb49
https://zenodo.org/records/3941811/files/police2_Depth.zip
2035803186
md5:fba5de41fb1e58945ff2357a94b78504
https://zenodo.org/records/3941811/files/police1.zip
5874799347
md5:a3322486f3bda99354721ad4fb17e161
https://zenodo.org/records/3941811/files/police4_Depth.zip
1739634575
md5:0b89a1e0c84e80cb46452824f0e881cb
https://zenodo.org/records/3941811/files/police3.zip
public
10.1109/ACCESS.2020.2993650
Is supplement to
doi
https://arxiv.org/abs/2007.12530
Is cited by
url
10.1109/TMM.2021.3070438
Is cited by
doi
10.5281/zenodo.3941810
isVersionOf
doi