Neuropsychological studies of linguistic and affective facial expressions in deaf signers
For deaf users of American Sign Language (ASL), facial behaviors function in two distinct ways: to convey affect (as with spoken languages) and to mark certain specific grammatical structures (e.g., relative clauses), thus subserving distinctly linguistic functions in ways that are unique to signed languages. The existence of two functionally different classes of facial behaviors raises questions concerning the neural control of language and nonlanguage functions. Examining patterns of neural mediation for the differential functions of facial expressions, linguistic versus affective, provides a unique perspective on the determinants of hemispheric specialization. This paper presents two studies that explore facial expression production in deaf signers. The first uses an experimental paradigm with chimeric stimuli of ASL linguistic and affective facial expressions (photographs of right vs. left composites of posed expressions) to explore patterns of productive asymmetries in brain-intact signers. The second study examines facial expression production in left- and right-brain-lesioned deaf signers, specifying unique patterns of spared and impaired functions. Both studies show striking differences between affective and linguistic facial expressions. The data indicate that for deaf signing individuals, affective expressions appear to be primarily mediated by the right hemisphere. In contrast, these studies provide evidence that linguistic facial expressions involve left hemisphere mediation.