Brain Processes Sign Language and Spoken Phrases Similarly
Topic: Deaf Communication
Author: New York University
Published: 2018/04/03 - Updated: 2024/10/01
Publication Type: Informative - Peer-Reviewed: Yes
Contents: Summary - Introduction - Main Item - Related Topics
Synopsis: Despite physical differences in how signed and spoken languages are produced and comprehended, the neural timing and localization of phrase planning are comparable between ASL and English. Although there are many reasons to believe that signed and spoken languages should be neurobiologically quite similar, evidence of overlapping computations at this level of detail is still a striking demonstration of the fundamental core of human language.
Why it matters: This study demonstrates that the neural mechanisms underlying phrase construction in sign language and spoken language are remarkably similar. Despite the obvious physical differences between signing and speaking, both engage the same brain regions with similar timing when building complex linguistic expressions. This finding provides strong evidence that the human brain processes language in fundamentally similar ways regardless of modality. The research deepens our understanding of language processing in the brain and highlights the importance of studying sign languages to uncover universal aspects of human communication. The findings have implications for our understanding of language evolution and language acquisition, and potentially for developing more effective therapies for language disorders across different linguistic modalities - Disabled World.
Introduction
When we sign, we build phrases with neural mechanisms similar to those we use when we speak. Although the differences between signed and spoken languages are significant, the underlying neural processes we use to create complex expressions are quite similar for both, a team of researchers has found.
Main Item
"This research shows for the first time that despite obvious physical differences in how signed and spoken languages are produced and comprehended, the neural timing and localization of the planning of phrases is comparable between American Sign Language and English," explains lead author Esti Blanco-Elorrieta, a doctoral student in New York University's Department of Psychology and NYU Abu Dhabi Institute.
The research is reported in the journal Scientific Reports.
"Although there are many reasons to believe that signed and spoken languages should be neurobiologically quite similar, evidence of overlapping computations at this level of detail is still a striking demonstration of the fundamental core of human language," adds senior author Liina Pylkkanen, a professor in New York University's Department of Linguistics and Department of Psychology.
The study also included Itamar Kastner, an NYU doctoral student at the time of the study and now at Berlin's Humboldt University, and Karen Emmorey, a professor at San Diego State University and a leading expert on sign language, who adds:
"We can only discover what is universal to all human languages by studying sign languages."
Past research has shown that signed and spoken languages are structurally fundamentally similar. However, it has been less clear whether the same brain circuitry underlies the construction of complex linguistic structures in sign and in speech.
To address this question, the scientists studied the production of two-word phrases in American Sign Language (ASL), by deaf signers residing in and around New York, and in spoken English, by hearing speakers living in Abu Dhabi.
Signers and speakers viewed the same pictures and named them with semantically identical expressions.
To gauge participants' neural activity during this experiment, the researchers used magnetoencephalography (MEG), a technique that maps neural activity by recording the magnetic fields generated by the brain's electrical currents.
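For readers curious about how MEG data of this kind are typically handled, the following is a minimal sketch using the open-source MNE-Python library. It is illustrative only: the file name, trigger channel, event codes, and time window are hypothetical placeholders, not details taken from the study's actual pipeline.

import mne

# Load a raw MEG recording (hypothetical file name).
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

# Find stimulus triggers marking picture onsets (hypothetical trigger channel).
events = mne.find_events(raw, stim_channel="STI 014")

# Epoch around each picture onset: 200 ms baseline to 600 ms post-stimulus.
epochs = mne.Epochs(raw, events,
                    event_id={"phrase": 1, "single_word": 2},
                    tmin=-0.2, tmax=0.6, baseline=(None, 0), preload=True)

# Average trials within each condition to obtain evoked responses.
evoked_phrase = epochs["phrase"].average()
evoked_word = epochs["single_word"].average()

# Contrast the two conditions to isolate activity tied to combining words.
contrast = mne.combine_evoked([evoked_phrase, evoked_word], weights=[1, -1])
contrast.plot_joint()

Contrasting a two-word phrase condition against a single-word condition, as in the last step, is a common way to isolate the combinatory step from simple picture naming.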
For both signers and speakers, phrase building engaged the same parts of the brain with similar timing: the left anterior temporal and ventromedial cortices, despite the different linguistic articulators involved (the vocal tract versus the hands).
The researchers point out that this neurobiological similarity between sign and speech therefore goes beyond basic structural parallels to more intricate processes: the same parts of the brain are used at the same time for the specific computation of combining words or signs into more complex expressions.
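To see how a claim about region-specific timing might be checked in practice, here is a hedged continuation of the MNE-Python sketch above (it reuses the evoked_phrase variable from that sketch). It projects the evoked response onto the cortical surface and extracts the mean time course of a left anterior temporal region; the subject name, inverse-operator file, and atlas label are illustrative assumptions, not the study's actual analysis.

from mne import read_labels_from_annot
from mne.minimum_norm import apply_inverse, read_inverse_operator

# Project the evoked response onto the cortical surface
# (hypothetical precomputed inverse operator).
inv = read_inverse_operator("subject01-inv.fif")
stc = apply_inverse(evoked_phrase, inv, lambda2=1.0 / 9.0, method="dSPM")

# Pull a left anterior temporal label from the FreeSurfer 'aparc' atlas.
labels = read_labels_from_annot("subject01", parc="aparc", hemi="lh",
                                subjects_dir="/path/to/subjects_dir")
anterior_temporal = [lb for lb in labels if "temporalpole" in lb.name][0]

# Mean activation time course within the region of interest; comparing
# such curves across signers and speakers is one way to ask whether
# the timing of phrase building is similar across modalities.
roi_timecourse = stc.extract_label_time_course(anterior_temporal,
                                               inv["src"], mode="mean")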
The research was supported by grants from the National Science Foundation (BCS-1221723) (LP) and the National Institutes of Health (R01-DC010997) (KE), by the NYUAD Institute, and by a La Caixa Foundation fellowship for Post-Graduate Studies (EBE).
Attribution/Source(s):
This peer-reviewed publication was selected for publishing by the editors of Disabled World due to its significant relevance to the disability community. Originally authored by New York University and published on 2018/04/03 (Edit Update: 2024/10/01), the content may have been edited for style, clarity, or brevity. For further details or clarifications, New York University can be contacted at nyu.edu. NOTE: Disabled World does not provide any warranties or endorsements related to this article.
Explore Related Topics
1 - Hard-of-hearing Prefer a Different Sound When It Comes to Music - Contemporary music can pose challenges for individuals with hearing impairments, but adjustments in sound mixing could potentially create a positive impact.
2 - The Brain Treats Hearing in a Crowded Room Differently - The brain treats speech in a crowded room differently depending on how easy it is to hear, and whether we are focusing on it.
3 - Self-taught Homesigning Deaf Children Support Universal Language Constraints - Deaf homesigners offer a unique window into whether there are universals for how people use language to talk about ideas.
4 - Humans Still Understand Chimpanzee and Bonobo Gestures - The discovery of gestures used by great apes provides evidence of intentional communication outside human language; over 80 such signals have now been identified.
5 - Struggling to Hear at Work - Visualise Training and Consultancy takes a look at the challenges employees with hearing loss face in the workplace.
Page Information, Citing and Disclaimer
Disabled World is a comprehensive online resource that provides information and news related to disabilities, assistive technologies, and accessibility issues. Founded in 2004, our website covers a wide range of topics, including disability rights, healthcare, education, employment, and independent living, with the goal of supporting the disability community and their families.
Cite This Page (APA): New York University. (2018, April 3 - Last revised: 2024, October 1). Brain Processes Sign Language and Spoken Phrases Similarly. Disabled World. Retrieved October 6, 2024, from www.disabled-world.com/disability/types/hearing/communication/neural-process.php
Permalink: Brain Processes Sign Language and Spoken Phrases Similarly (https://www.disabled-world.com/disability/types/hearing/communication/neural-process.php): Despite physical differences in how signed and spoken languages are produced and comprehended, neural timing and localization of phrase planning are comparable between ASL and English.
Disabled World provides general information only. Materials presented are never meant to substitute for qualified medical care. Any 3rd party offering or advertising does not constitute an endorsement.