Previous talks at the SCCS Colloquium

Maximilian Karpfinger: Sign Language Translation on Mobile Devices

SCCS Colloquium


Communication is essential for members of the deaf community, both with each other and with people who do not know sign language, yet this language barrier is hard to bridge: automated translation of sign language into spoken language remains a difficult task. To date, no application translates sign language in this way; some approaches can detect and translate individual alphabetic characters, but not full sentences. While many tools exist that support people who need visual or reading assistance, such as the Google Assistant, the number of tools available for the deaf community is still very small. New technologies and modern smartphones, however, open up more opportunities.

This thesis is about translating sign language on mobile devices. In particular, an Android app was developed that translates sign language to text in real time, built on Google's MediaPipe framework. Using holistic tracking, hand, facial, and pose landmarks are extracted from the camera stream and fed into a model that maps them to German text, which is then shown on the smartphone's display. Three different models were trained on data provided by the DGS-Korpus, a long-term project on sign languages by the Akademie der Wissenschaften Hamburg. As a result, an Android app was successfully implemented that uses a neural network, trained on the 100 most frequent signs in the available DGS-Korpus videos, to translate pose landmarks into words. The network reaches approximately 30% accuracy, so there is still considerable room for improvement.
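The pipeline described above — per-frame landmarks flattened into a feature vector and classified into one of 100 sign labels — can be sketched as follows. This is a minimal illustration, not the thesis implementation: the landmark counts mirror MediaPipe Holistic's documented output (33 pose, 21 per hand, 468 face landmarks, each an x/y/z point), but the random "frame", the linear classifier, and the `sign_i` labels are placeholders standing in for the real camera stream, trained neural network, and DGS glosses.

```python
import random

# Landmark counts as in MediaPipe Holistic: 33 pose, 21 per hand, 468 face.
POSE, HAND, FACE = 33, 21, 468
N_LANDMARKS = POSE + 2 * HAND + FACE   # 543 landmarks per frame
N_FEATURES = 3 * N_LANDMARKS           # (x, y, z) flattened -> 1629 values
N_SIGNS = 100                          # the 100 most frequent signs

def flatten_frame(landmarks):
    """Flatten a list of (x, y, z) landmark tuples into one feature vector."""
    return [coord for point in landmarks for coord in point]

def classify(features, weights, labels):
    """Toy stand-in for the trained network: a linear score per sign label."""
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    return labels[scores.index(max(scores))]

# --- illustrative run with random data in place of a real camera frame ---
random.seed(0)
frame = [(random.random(), random.random(), random.random())
         for _ in range(N_LANDMARKS)]
weights = [[random.uniform(-1, 1) for _ in range(N_FEATURES)]
           for _ in range(N_SIGNS)]
labels = [f"sign_{i}" for i in range(N_SIGNS)]  # placeholder gloss labels

features = flatten_frame(frame)
word = classify(features, weights, labels)
print(word)  # the predicted sign label for this frame
```

In the app itself this step runs on every camera frame, so the displayed text updates in real time as the model's prediction changes.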

Master's thesis presentation. Maximilian is advised by Dr. Felix Dietrich.