Sign languages are independent languages used by deaf communities, and most countries have their own. What can sign languages teach us about the brain?
“I love you” in American Sign Language. Picture taken by Wes Peck (CC BY-ND 2.0)
Sign language is much more than a loose collection of pantomime-like gestures. It is highly structured and has its own complex grammar. Just as spoken English and spoken Dutch have their own rules for forming words and sentences, American and Dutch Sign Language have their own rules for producing signs and signed sentences. However, it would be wrong to think that signed languages differ from spoken languages only in the medium of communication, i.e. the hands. American Sign Language, for example, is not simply English expressed with the hands: it also uses facial expressions and eye gaze to communicate. This, together with its own structure and grammar, is what makes American Sign Language a unique and independent language. Most other sign languages in the world are unique in the same way.
Is a signing brain different from a speaking brain?
Perhaps surprisingly, it usually is not. The brain seems to process all languages similarly, regardless of whether they are spoken or signed. How do we know this?
Research has shown that damage to language-related brain areas results in the same language problems in hearing and deaf individuals. For example, both hearing and deaf patients with damage to the left frontal lobe had problems with speaking or signing, respectively. In addition to these brain-damage studies, technology that measures brain activity can be used to explore how brain areas are engaged during language processing in healthy brains. Researchers using functional magnetic resonance imaging (fMRI) have found that similar brain regions are active in deaf and hearing people when they process words or signs. For example, a research team from San Diego scanned the brains of deaf and hearing participants while they read English words and sentences. Both groups activated a region in the left hemisphere of the brain known as the Visual Word Form Area. These similar patterns of activation in deaf and hearing readers suggest that the brain's response to reading words and sentences does not depend on the ability to pronounce them.
So, what can we learn from sign languages?
Languages are processed in the same areas of the brain, regardless of whether someone is using English or American Sign Language. This suggests that the brain does not care whether people use their voices or their hands to talk.
This blog was written by Francie Manhardt. She is a PhD student in the Multimodal Language and Cognition Group at Radboud University Nijmegen. Her research focuses on spatial relations in sign languages.
Edited by Kasia and Nietzsche.