
They design the first app that translates Spanish sign language in real time

The Robotics and Three-Dimensional Vision Group (RoViT) at the University of Alicante (UA) has designed Sign4all, the first application capable of recognizing and interpreting in real time the Spanish sign language alphabet (known by the acronym LSE). This is an advance that contributes to breaking down communication barriers between deaf and hearing people in situations as everyday as going to the doctor’s office or eating at a restaurant.

Around 27,300 deaf people in Spain use sign language

According to the latest Survey on Disability, Personal Autonomy and Situations of Dependence by the National Institute of Statistics (INE), there are 1,230,000 people in Spain with hearing impairments of varying types and degrees. Of these, 27,300 use sign language to communicate.

Using a range of computer vision and deep learning techniques, UA researcher Ester Martínez, who holds a PhD in Computer Science, together with doctoral student Francisco Morillas, has developed this low-cost tool to assist deaf people when they cannot be accompanied by an interpreter.

After capturing the person and extracting a skeleton of the arms and hands, Sign4all encodes the left side of the body in blue and the right side in red, preserving the user's anonymity at all times. The application then translates in real time the signs made by the deaf person and, in the opposite direction, can sign words in Spanish typed by the hearing person through a virtual avatar.
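The anonymized encoding described above can be sketched in a few lines: skeleton landmarks are drawn on a blank canvas, colored blue or red by body side, so no pixels of the person's image are ever kept. This is a minimal illustration, not the project's actual code; the `Landmark` type, the landmark names, and the canvas size are all assumptions made for the example.

```python
# Illustrative sketch of Sign4all's anonymized skeleton encoding:
# left-side landmarks in blue, right-side landmarks in red, rendered
# on a blank canvas rather than on the camera image.
# The Landmark type and names are hypothetical, chosen for this example.
from dataclasses import dataclass

BLUE = (255, 0, 0)   # BGR color for the left side of the body
RED = (0, 0, 255)    # BGR color for the right side of the body

@dataclass
class Landmark:
    name: str   # e.g. "left_wrist", "right_index_tip"
    x: float    # normalized [0, 1] horizontal position
    y: float    # normalized [0, 1] vertical position

def side_color(landmark: Landmark) -> tuple:
    """Encode body side as a color, as the article describes."""
    return BLUE if landmark.name.startswith("left") else RED

def render_skeleton(landmarks, width=640, height=480):
    """Project normalized landmarks onto pixel coordinates with their
    side color -- a stand-in for an actual drawing call on a blank canvas."""
    return [
        (int(lm.x * width), int(lm.y * height), side_color(lm))
        for lm in landmarks
    ]

points = render_skeleton([
    Landmark("left_wrist", 0.25, 0.5),
    Landmark("right_wrist", 0.75, 0.5),
])
```

In a real pipeline, the landmarks would come from a pose- and hand-tracking model run on the phone or tablet camera feed, and the colored points would feed the sign classifier.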

“The idea is that this whole process can be done by downloading an app and using the camera on your cell phone or tablet, so that it can be used anywhere,” says Martínez.

The app translates sign language into written Spanish in real time and vice versa

The project began two years ago, coinciding with the final-year project of Francisco Morillas, who is now doing his PhD. It grew out of an application the group designed in 2018 to monitor and guide physical exercise for the elderly, along with an interface through which these people could interact with it. While developing that interface, they asked themselves "what happens if the person the robot is trying to monitor has some kind of visual or hearing impairment?", explains the scientist.

Sign4all can interpret and recognize the LSE alphabet with an accuracy of 80%

Ester Martínez

“After many tests, Sign4all can interpret and recognize the LSE alphabet with an accuracy of 80%,” she explained. “Although this result corresponds to the fingerspelling alphabet, we are working on a version with a specific vocabulary for everyday tasks, where complete sentences can be interpreted,” she points out.

The UA team has spent months “training” this new system, introducing more and more signs. Along the way, a collaboration emerged with the Research Group on Spanish Language and Sign Languages (GRILES) at the University of Vigo, a group with extensive experience in studying this language and its use across different territories.

Vocabulary expansion

“At the University of Vigo they are collecting images with interpreters, and we are processing all this data. In this way, we can expand the vocabulary of our LSE recognition and interpretation system much more quickly and improve its performance,” says Martínez.

The application currently recognizes the sign language alphabet and lets users work with it, but the two UA researchers already have a new corpus of food-related vocabulary and, through tests they are conducting with their colleagues at the University of Vigo, will add expressions for social interaction.

“Right now, only the alphabet part is complete, although we are trying to expand it so that the application is truly useful and reaches as many people as possible,” concludes the researcher.

Reference:

Martínez, E. “Deep Learning Techniques for Spanish Sign Language Interpretation.” Computational Intelligence and Neuroscience (2023).
