Syntax-aware Transformers for Neural Machine Translation: The Case of Text to Sign Gloss Translation

Author(s): EGEA, Santiago; MCGILL, Euan; SAGGION, Horacio
Year: 2021
Publisher: 14th Workshop on Building and Using Comparable Corpora, RANLP
Code type: Copyright
Medium: Digital

Topics

Linguistics » Sign Language transcription systems, Media and access to information » New Technologies

Details

It is well established that Sign Languages (SLs) are the preferred mode of communication of the deaf and hard of hearing (DHH) community, yet they are considered low-resource languages as far as natural language processing technologies are concerned. In this paper we study the problem of text to SL gloss Machine Translation (MT) using Transformer-based architectures. Despite the significant advances of MT for spoken languages over the last two decades, MT is in its infancy when it comes to SLs. We enrich a Transformer-based architecture by aggregating syntactic information, extracted with a dependency parser, into the word embeddings. We test our model on a well-known dataset, showing that the syntax-aware model obtains performance gains in terms of MT evaluation metrics.
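The abstract describes the approach only at a high level. As an illustration, the minimal PyTorch sketch below shows one plausible way to aggregate per-token dependency-parser labels with word embeddings before a standard Transformer encoder. The class names, dimensions, and the concatenate-then-project scheme are assumptions made for illustration, not the authors' published implementation.

```python
# Illustrative sketch (assumed design, not the paper's code): each token gets
# a word embedding plus an embedding of its dependency relation label
# (nsubj, obj, ...); the two are concatenated and projected back to d_model.
import torch
import torch.nn as nn

class SyntaxAwareEmbedding(nn.Module):
    def __init__(self, vocab_size, num_dep_relations, d_model=512, d_syntax=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        # One learned vector per dependency relation label
        self.dep_emb = nn.Embedding(num_dep_relations, d_syntax)
        # Project the enriched vector back to the model dimension
        self.proj = nn.Linear(d_model + d_syntax, d_model)

    def forward(self, token_ids, dep_ids):
        # token_ids, dep_ids: (batch, seq_len), aligned token by token
        combined = torch.cat([self.word_emb(token_ids),
                              self.dep_emb(dep_ids)], dim=-1)
        return self.proj(combined)

# Feed the syntax-enriched embeddings into a standard Transformer encoder
embed = SyntaxAwareEmbedding(vocab_size=10_000, num_dep_relations=50)
layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

tokens = torch.randint(0, 10_000, (2, 7))  # toy batch of token ids
deps = torch.randint(0, 50, (2, 7))        # dependency-relation id per token
out = encoder(embed(tokens, deps))         # -> (2, 7, 512)
```

Summing the two embeddings, or injecting syntax at a later layer, would be equally plausible variants of the same idea; the projection after concatenation is used here simply to keep the encoder's input dimension unchanged.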

Location