Evidence for shared conceptual representations for sign and speech
Do different languages evoke different conceptual representations? If so, the greatest divergence might be expected between languages that differ most in structure, such as sign and speech. Unlike bilinguals in two spoken languages, hearing sign-speech bilinguals use languages conveyed in different modalities. We used functional magnetic resonance imaging and representational similarity analysis (RSA) to quantify the similarity of semantic representations elicited by the same concepts presented in spoken British English and British Sign Language in hearing, early sign-speech bilinguals. We found shared representations for semantic categories in left posterior middle and inferior temporal cortex. Despite these shared category representations, the same spoken words and signs did not elicit similar neural patterns. Thus, contrary to previous univariate activation-based analyses of speech and sign perception, we show that semantic representations evoked by speech and sign are only partially shared. This demonstrates the unique perspective that sign languages and RSA provide in understanding how language influences conceptual representation.
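The core logic of the RSA comparison described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis pipeline: all variable names, data shapes, and the random data are hypothetical, and the specific distance and correlation choices (correlation distance for the representational dissimilarity matrices, Spearman rank correlation between them) are common RSA conventions rather than details stated in the abstract.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical data: concept-by-voxel activation patterns for the same
# set of concepts presented in each language (shapes are illustrative).
rng = np.random.default_rng(0)
n_concepts, n_voxels = 8, 100
patterns_speech = rng.standard_normal((n_concepts, n_voxels))
patterns_sign = rng.standard_normal((n_concepts, n_voxels))

def rdm(patterns):
    """Representational dissimilarity matrix in condensed form:
    1 - Pearson correlation between every pair of concept patterns."""
    return pdist(patterns, metric="correlation")

# Compare the representational geometry evoked by the two languages by
# rank-correlating their RDMs (a second-order similarity test): a high
# correlation would indicate shared structure despite different modalities.
rho, p = spearmanr(rdm(patterns_speech), rdm(patterns_sign))
print(f"RDM similarity (Spearman rho): {rho:.3f}")
```

Because the comparison operates on dissimilarity structure rather than raw voxel patterns, it can detect shared category-level geometry even when the item-level patterns themselves differ, which is the distinction the abstract draws.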