SignReality

Extended Reality for Sign Language Translation

The research project “SignReality – Extended Reality for Sign Language Translation” aims to develop a model and an augmented reality (AR) application that visualizes an animated interpreter for German Sign Language (DGS). The project is a collaboration between the DFKI-DRX department and the Affective Computing group of DFKI-COS, and is part of the broader activities of the DFKI Sign Language team, which spans four departments and has conducted two EU-funded and two Germany-funded research projects.

The app developed within SignReality will give deaf and hard-of-hearing users a personal interpreter in augmented or virtual space who can translate spoken language and text. Users will be able to position and resize the interpreter to suit their translation needs: for signers it is, for instance, more effective to place the interpreter next to the speaking person, since keeping the speaker in direct view improves comprehension. The application will serve as a research prototype for exploring new methods of interaction and content delivery between deaf and hard-of-hearing users and their surrounding environment, with the goal of reducing communication barriers with hearing individuals.

Duration: March 2024 – October 2024

Funding: FSTP call of the EU project UTTER (Unified Transcription and Translation for Extended Reality; EU Horizon, Grant Agreement No. 101070631), in collaboration with the Universities of Amsterdam and Edinburgh.

Partners: University of Edinburgh, University of Amsterdam, IT Lisbon, NAVER LABS Europe, and Unbabel

Official Homepage: https://www.dfki.de/web/forschung/projekte-publikationen/projekt/signreality