Multiple Brain Regions Wired for Language
- Publish Date: 2010/04/30
- Author: University of Rochester
Outline: There is no single advanced area of the human brain that gives humans language capabilities above and beyond those of any other animal species.
Main Digest: Sign Language Study Shows Multiple Brain Regions Wired for Language
A new study from the University of Rochester finds that there is no single advanced area of the human brain that gives it language capabilities above and beyond those of any other animal species.
Instead, humans rely on several regions of the brain, each designed to accomplish different primitive tasks, in order to make sense of a sentence. Depending on the type of grammar used in forming a given sentence, the brain activates a particular set of regions to process it, much as a carpenter digs through a toolbox to select the tools needed for each basic component of a complex task.
"We're using and adapting the machinery we already have in our brains," said study coauthor Aaron Newman. "Obviously we're doing something different [from other animals], because we're able to learn language unlike any other species. But it's not because some little black box evolved specially in our brain that does only language, and nothing else."
The team of brain and cognitive scientists - comprising Newman (now at Dalhousie University after beginning the work as a postdoctoral fellow at the University of Rochester), Elissa Newport (University of Rochester), Ted Supalla (University of Rochester), Daphne Bavelier (University of Rochester), and Peter Hauser (Rochester Institute of Technology) - published their findings in the latest edition of the journal Proceedings of the National Academy of Sciences.
To determine whether different brain regions are used to decipher sentences with different types of grammar, the scientists turned to American Sign Language, which has a rare and useful quality.
Some languages (English, for example) rely on the order of words in a sentence to convey the relationships between the sentence elements. When an English speaker hears the sentence "Sally greets Bob," it's clear from the word order that Sally is the subject doing the greeting and Bob is the object being greeted, not vice versa.
Other languages (Spanish, for example) rely on inflections, such as suffixes tacked on to the ends of words, to convey subject-object relationships, and the word order can be interchangeable.
American Sign Language has the helpful characteristic that subject-object relationships can be expressed in either of two ways - word order or inflection. A signer can either sign the word "Sally" followed by the words "greets" and "Bob" (a construction in which word order dictates meaning), or use physical inflections, such as moving the hands through space or signing on one side of the body, to convey the relationship between elements. For the study, the team composed 24 sentences and expressed each of them using both methods.
Videos of the sentences being signed were then played for the subjects of the experiment, native signers who were lying on their backs in MRI (magnetic resonance imaging) machines with coils around their heads to monitor which areas of the brain were activated when processing the different types of sentences.
The study found that there are, in fact, distinct regions of the brain that are used to process the two types of sentences: those in which word order determines the relationships between the sentence elements, and those in which inflection provides that information.
In fact, Newman said, in trying to understand different types of grammar, humans draw on regions of the brain that are designed to accomplish primitive tasks that relate to the type of sentence they are trying to interpret. For instance, a word order sentence draws on parts of the frontal cortex that give humans the ability to put information into sequences, while an inflectional sentence draws on parts of the temporal lobe that specialize in dividing information into its constituent parts, the study demonstrated.
"These results show that people really ought to think of language and the brain in a different way, in terms of how the brain capitalizes on some perhaps preexisting computational structures to interpret language," Newport said.
Aside from providing perspective on how language abilities might have evolved in humans, the scientists' findings could eventually find applications in medicine, according to Newport. For instance, they could prove valuable in assessing how best to teach language to a person, such as a stroke victim, with damage in some brain areas but not others.