Published November 1, 2018 | Version 10009870
Journal article | Open Access

A Motion Dictionary to Real-Time Recognition of Sign Language Alphabet Using Dynamic Time Warping and Artificial Neural Network

Description

Computational recognition of sign languages aims to
enable greater social and digital inclusion of deaf people through
computer interpretation of their language. This article presents
a model for recognizing two global parameters of sign
languages: hand configurations and hand movements. Hand motion
is captured with an infrared sensor, and the hand joints are reconstructed
in a virtual three-dimensional space. A Multilayer Perceptron
(MLP) neural network classifies hand configurations, and
Dynamic Time Warping (DTW) recognizes hand motion. In addition
to the recognition method, we provide a dataset of
hand configurations and motion captures built with the help of
professionals fluent in sign languages. Although this technology can be
used to translate any sign from any sign dictionary, Brazilian
Sign Language (Libras) was used as a case study. Finally, the model
presented in this paper achieved a recognition rate of 80.4%.
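To illustrate the motion-recognition step described above, the following is a minimal sketch of classic Dynamic Time Warping applied to 3-D joint trajectories, with nearest-template matching against a small motion dictionary. It is not the authors' implementation; the trajectories and dictionary entries are hypothetical toy data, and the real system operates on joint positions captured by the infrared sensor.

```python
import math

def dtw_distance(seq_a, seq_b):
    """Accumulated DTW cost between two trajectories.

    Each sequence is a list of 3-D points (x, y, z), e.g. hand
    joint positions sampled over time.
    """
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = minimal cost of aligning seq_a[:i] with seq_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # Euclidean distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]

# Hypothetical motion dictionary: sign label -> template trajectory.
dictionary = {
    "J": [(0, 0, 0), (1, 1, 0), (2, 1, 1)],
    "Z": [(0, 0, 0), (2, 0, 0), (0, 2, 0)],
}

# A captured query trajectory is matched to the nearest template.
query = [(0, 0, 0), (1, 1, 0), (2, 2, 1)]
best = min(dictionary, key=lambda sign: dtw_distance(query, dictionary[sign]))
```

Because DTW aligns sequences non-linearly in time, two executions of the same sign at different speeds can still match the same template, which is what makes it suitable for recognizing hand movements of varying duration.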

Files

10009870.pdf (297.8 kB) — md5:34db115bec1910fe0867c9b4fc9dfc4c