IPN Students Develop Glove That Converts Mexican Sign Language into Speech

• Facilitates communication with people with hearing impairments and is built on a database of gestures trained with machine learning, a branch of artificial intelligence (AI)

• The gesture recognizer uses position sensors and was created by UPIITA students, who succeeded in translating key words of Mexican Sign Language into speech

Key words and phrases from Mexican Sign Language (LSM) can now be translated into speech, thanks to a device developed by students at the Instituto Politécnico Nacional (IPN). By integrating programming, digital electronics, and Artificial Intelligence, among other technologies, the device aims to facilitate communication and social inclusion for people with hearing disabilities.

The project, titled “Gesture Recognizer for Mexican Sign Language Using Myoelectric Signals and Position Sensors,” was developed by Valeria Ramos Vázquez and Pedro Martín Morales Flores, Bionics Engineering students, as part of their degree requirements at the Unidad Profesional Interdisciplinaria en Ingeniería y Tecnologías Avanzadas (UPIITA).

This project aligns with the educational, scientific, and technological directives established by President Claudia Sheinbaum Pardo and the Secretary of Public Education (SEP), Mario Delgado Carrillo.

The prototype recognizes certain Mexican Sign Language gestures through two data acquisition systems: a glove embedded with position sensors, and two circuits that capture myoelectric signals (the small electrical currents muscles generate during movement) from a person’s arm and forearm.
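As a rough illustration of what such a two-stream setup might involve, the sketch below reads one combined glove-plus-EMG sample over a serial link. The port name, channel counts, and comma-separated packet format are assumptions made for illustration, not details of the students’ design.

```python
# Hypothetical two-stream acquisition: position values from the glove plus
# two myoelectric channels, arriving as one comma-separated line per sample.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed serial link to the microcontroller
N_FLEX = 5              # one position value per finger (assumption)
N_EMG = 2               # two myoelectric channels: arm and forearm

def read_sample(link: serial.Serial) -> tuple[list[float], list[float]]:
    """Read one line and split it into glove values and EMG values."""
    fields = link.readline().decode("ascii", errors="ignore").strip().split(",")
    values = [float(v) for v in fields[:N_FLEX + N_EMG]]
    return values[:N_FLEX], values[N_FLEX:]

if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=1) as link:
        flex, emg = read_sample(link)
        print("glove:", flex, "emg:", emg)
```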

The students explained that, while similar devices exist in countries such as the United States and China, Mexico still lacks widespread solutions.

With the support of the organization Latido Sordo A.C. in Querétaro and the government dictionary “Manos con voz. Diccionario de Lengua de Señas Mexicana,” Ramos Vázquez and Morales Flores selected 13 commonly used words and phrases and broke them down into gestures: good morning; good night; eat; night; how; to be; I help you; can you help me; please; need; bad; good; and you’re welcome.

From this list, they created a gesture database trained using machine learning, a branch of Artificial Intelligence.
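The release does not say which machine-learning algorithm the students trained, so the toy example below uses a k-nearest-neighbors classifier from scikit-learn, with random numbers standing in for the real gesture database, purely to show the shape of such a training step.

```python
# Toy training step: k-NN stands in for the unspecified algorithm, and
# random numbers stand in for the students' real gesture recordings.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

PHRASES = ["good morning", "good night", "eat", "please"]  # subset of the 13

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 7))            # assumed 5 flex + 2 EMG features per sample
y = rng.choice(len(PHRASES), size=200)   # which phrase each recording represents

model = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(PHRASES[int(model.predict(X[:1])[0])])  # predicted phrase for one sample
```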

The project also brings together gesture-recognition technology for Mexican Sign Language, concatenation methods that chain recognized gestures into sentences, and myoelectric signal processing.
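The release does not describe how that concatenation works; as a purely hypothetical illustration, chaining recognized gestures into a spoken sentence could be as simple as the following.

```python
# Hypothetical concatenation step: join recognized gestures into one sentence
# that would then be handed to the audio playback stage.
def to_sentence(gestures: list[str]) -> str:
    return " ".join(gestures).capitalize() + "."

print(to_sentence(["can you help me", "please"]))  # Can you help me please.
```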

When a gesture is performed, the signals are filtered and processed, features are extracted, and the result is compared with the database. The system then selects the closest match and reproduces the corresponding audio.

Dr. Blanca Tovar Corona, from the Academy of Systems and the Graduate Program in Advanced Technology, Artificial Intelligence, and Data Science at UPIITA, noted that the project integrates motion sensors and electromyography signals, allowing it to detect finger and arm movements.
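That description follows a classic pattern-recognition pipeline: filter, extract features, find the nearest database entry, play audio. The sketch below mirrors those steps; the filter band, the statistics used as features, and the two-channel stand-in data are assumptions for illustration, not the project’s actual design.

```python
# Sketch of the described pipeline: filter -> features -> nearest match -> audio.
import numpy as np
from scipy.signal import butter, filtfilt

def extract_features(raw: np.ndarray, fs: float = 1000.0) -> np.ndarray:
    """Band-pass filter each channel, then summarize it with simple statistics."""
    b, a = butter(4, [20, 450], btype="band", fs=fs)  # a typical EMG band (assumed)
    clean = filtfilt(b, a, raw, axis=0)
    return np.concatenate([clean.mean(axis=0), clean.std(axis=0)])

def closest_phrase(vec: np.ndarray, database: dict[str, np.ndarray]) -> str:
    """Return the phrase whose stored feature vector is nearest (Euclidean)."""
    return min(database, key=lambda phrase: np.linalg.norm(vec - database[phrase]))

# Stand-in database and two seconds of fake two-channel EMG at 1 kHz:
db = {"good morning": np.zeros(4), "please": np.ones(4)}
sample = np.random.default_rng(1).normal(size=(2000, 2))
print(closest_phrase(extract_features(sample), db))  # this phrase's audio would play
```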

Dr. Álvaro Anzueto Ríos, a member of the Bionics Academy at UPIITA, explained that the system can learn to differentiate gestures, recognizing not only the movement but also its meaning, and distinguishing it from other gestures.

For more information, visit www.ipn.mx