Evidence for shared conceptual representations for sign and speech

Author(s): EVANS, Samuel; PRICE, Cathy; DIEDRICHSEN, Jörn; GUTIERREZ SIGUT, Eva; MACSWEENEY, Mairéad
Year: 2019
Publisher: bioRxiv: The preprint server for biology
Rights type: Copyright
Format: Digital

Topics

Education » Language acquisition and development, Education » Psychological and cognitive aspects

Details

Do different languages evoke different conceptual representations? If so, the greatest divergence might be expected between languages that differ most in structure, such as sign and speech. Unlike bilinguals of two spoken languages, hearing sign-speech bilinguals use languages conveyed in different modalities. We used functional magnetic resonance imaging and representational similarity analysis (RSA) to quantify the similarity of semantic representations elicited by the same concepts presented in spoken British English and British Sign Language in hearing, early sign-speech bilinguals. We found shared representations for semantic categories in left posterior middle and inferior temporal cortex. Despite these shared category representations, the same spoken words and signs did not elicit similar neural patterns. Thus, contrary to previous univariate activation-based analyses of speech and sign perception, we show that semantic representations evoked by speech and sign are only partially shared. This demonstrates the unique perspective that sign languages and RSA provide in understanding how language influences conceptual representation.
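
To illustrate the kind of analysis described above, here is a minimal sketch of cross-modal RSA in Python. It assumes each modality yields a conditions × voxels matrix of fMRI response patterns for the same concepts; a representational dissimilarity matrix (RDM) is built per modality and the two RDMs are then correlated. The variable names, the correlation-distance RDM, and the Spearman comparison are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch of cross-modal representational similarity analysis (RSA).
# Inputs are assumed to be (conditions x voxels) fMRI pattern matrices
# for the same concepts presented in speech and in sign.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr


def rdm(patterns: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix:
    1 - Pearson r between every pair of condition patterns."""
    return squareform(pdist(patterns, metric="correlation"))


def cross_modal_rsa(speech_patterns: np.ndarray,
                    sign_patterns: np.ndarray) -> float:
    """Spearman correlation between the upper triangles of the two RDMs,
    i.e. how similarly the two modalities structure the same concepts."""
    n = speech_patterns.shape[0]
    iu = np.triu_indices(n, k=1)
    rho, _ = spearmanr(rdm(speech_patterns)[iu], rdm(sign_patterns)[iu])
    return rho


# Toy example with simulated data: 40 concepts x 200 voxels per modality,
# sharing a common underlying structure plus modality-specific noise.
rng = np.random.default_rng(0)
shared = rng.standard_normal((40, 200))
speech = shared + 0.5 * rng.standard_normal((40, 200))
sign = shared + 0.5 * rng.standard_normal((40, 200))
print(f"Cross-modal RSA (Spearman rho): {cross_modal_rsa(speech, sign):.2f}")
```

In a searchlight or region-of-interest version of this approach, the same comparison would be repeated over local voxel neighbourhoods, which is how shared category structure can be localised to regions such as left posterior middle and inferior temporal cortex.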