An annotated Japanese Sign Language Corpus

Author(s): KOIZUMI, Atsuko; SAGAWA, Hirohiko; TAKEUCHI, Masaru
Year: 2002
Publisher: Tokyo, 2002
Rights type: Copyright
Medium: Digital


Linguistics » Linguistics of other Sign Languages, Linguistics » Sign Language transcription systems, Linguistics » Signed corpora


Sign language is characterized by its interactivity and multimodality, which cause difficulties in data collection and annotation. To address these difficulties, we have developed a video-based Japanese sign language (JSL) corpus and a corpus tool for annotation and linguistic analysis. As the first step of linguistic annotation, we transcribed manual signs expressing lexical information as well as non-manual signs (NMSs) - including head movements, facial actions, and posture - that are used to express grammatical information.

Our purpose is to extract grammatical rules from this corpus for the sign-language translation system under development. From this viewpoint, we discuss methods for collecting elicited data, the annotation required for grammatical analysis, and the corpus tool required for annotation and grammatical analysis. As a result of annotating 2,800 utterances, we confirmed that there are at least 50 kinds of NMSs in JSL, produced with the head (seven kinds), jaw (six kinds), mouth (18 kinds), cheeks (one kind), eyebrows (four kinds), eyes (seven kinds), eye gaze (two kinds), and body posture (five kinds). We use this corpus for designing and testing an algorithm and grammatical rules for the sign-language translation system under development.
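The per-articulator counts reported above can be cross-checked with a simple tally. The sketch below (a plain Python dictionary assumed for illustration, not the authors' actual corpus tool or annotation format) confirms that the eight articulator categories account for the 50 kinds of NMSs:

```python
# Hypothetical tally of the NMS inventory reported for the JSL corpus.
# Articulator names and counts are taken from the abstract; the data
# structure itself is illustrative only.
nms_inventory = {
    "head": 7,
    "jaw": 6,
    "mouth": 18,
    "cheeks": 1,
    "eyebrows": 4,
    "eyes": 7,
    "eye gaze": 2,
    "body posture": 5,
}

total = sum(nms_inventory.values())
print(total)  # 50, matching the "at least 50 kinds of NMSs" reported
```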