Neural Organization of Signed and Spoken Language

Spoken Language

Spoken language operates through an auditory-vocal process. Anatomically, vocal fold vibrations (the sound source) pass through the filter of the vocal tract, shaped by the pharynx and by the actions of the glottis, tongue, jaw, and lips. Neurologically, spoken language production and comprehension engage distinct regions of the brain. Broca's area, located in the left frontal lobe near the motor cortex that controls movements of the mouth and lips, is primarily responsible for speech production. Wernicke's area, located in the temporal lobe, is primarily responsible for comprehension. The auditory cortex, also in the temporal lobe, processes incoming speech sounds.

Signed Language

Signed language operates through a visual-gestural process. While the visual system may assist comprehension of spoken language, it is essential to comprehension of signed language. The visual cortex is located in the occipital lobe, at the posterior of the brain. Sign production relies on the motor cortex in the frontal lobe, the site of motor planning and motor output, with additional contributions from the cerebellum, which controls fine motor movements.


Similarities Between Signed and Spoken Languages
There are many fundamental neurological similarities between signed and spoken languages. Many studies have demonstrated left hemisphere dominance for both language systems. Describing the left hemisphere as "specialized" for language, rather than strictly "lateralized", captures this dominance while leaving room for processes that are in fact products of right hemisphere activity.

Both signed and spoken languages utilize Broca's area and Wernicke's area, and each relies on processes of the motor cortex. Studies comparing signing and speaking aphasics reveal similar deficits across modalities: patients with Broca's aphasia have trouble with both word and sign production and with fluency, while patients with Wernicke's aphasia show parallel deficits in comprehension and content.

Studies of patients with specific left hemisphere or right hemisphere damage reveal similarities as well. Left-hemisphere-damaged (LHD) patients experience paraphasias: "slips of the tongue" in speakers and "slips of the hand" in signers. They also have difficulty with fluency and with picture-naming tasks. Speaking and signing right-hemisphere-damaged (RHD) patients have difficulty with more global, holistic processes, such as following or maintaining fluent discourse.

In addition, research shows major contributions to language from areas such as the basal ganglia and the temporal gyrus, regions that also play a role in a variety of other cognitive processes. These and other multi-function brain regions support the notion that brain structures have wide-reaching, dynamic capabilities across modalities.

Simultaneous Learning

The ability of an individual to use both a spoken language and a signed language is known as bimodal bilingualism. It commonly occurs in hearing children of deaf parents, who learn spoken language from speaking relatives or teachers and sign language from their parents. This capability is a unique and interesting form of bilingualism because it requires two distinct sensory-motor systems. Neural organization appears flexible enough to control, process, and represent two languages in different modalities. Bimodal bilinguals display a relationship between linguistic and non-linguistic function and second language acquisition: acquiring a signed language has been shown to enhance a variety of non-linguistic visuospatial abilities, abilities that are also related to the processing requirements of sign language.


References
Emmorey, K., & McCullough, S. (2008). The bimodal bilingual brain: Effects of sign language experience. Brain & Language, 117(2), 53-62. doi:10.1016/j.bandl.2008.03.005

Harley, T. A. (2008). The Psychology of Language. London: Psychology Press.

Hickok, G., Bellugi, U., & Klima, E. S. (1996). The neurobiology of sign language and its implications for the neural basis of language. Nature, 381, 699-702.

Hickok, G., Bellugi, U., & Klima, E. S. (2002). Sign language in the brain. Scientific American, 46-53. Retrieved from Scientific American Online.

Swisher, V. M. (1988). Similarities and differences between spoken languages and natural sign languages. Applied Linguistics, 9(4), 343-356. doi:10.1093/applin/9.4.343