Neural Organization of Signed and Spoken Language

Spoken Language

Spoken language operates through an auditory-vocal process. Anatomically, vocal fold vibrations (the sound source) pass through the filter of the vocal tract, shaped by the pharynx and the actions of the tongue, jaw, and lips. Neurologically, spoken language production and comprehension engage distinct regions of the brain. Broca's area, located in the left frontal lobe, is primarily responsible for speech production; it lies near the motor cortex, which controls movements of the mouth and lips. Wernicke's area, located in the temporal lobe, is primarily responsible for comprehension. The auditory cortex, also in the temporal lobe, processes spoken sound.

Signed Language

Signed language operates through a visual-gestural process. While the visual system may assist comprehension of spoken language, it is essential for comprehension of signed language. The visual cortex is located in the occipital lobe, at the posterior of the brain. Sign production relies on the motor cortex, in the frontal lobe, which is the site of motor planning and motor output. There are also contributions from the cerebellum, which controls fine motor movements.


Similarities and Differences

There are many fundamental neurological similarities between signed and spoken languages. Many studies have demonstrated left hemisphere dominance for both language systems. Rather than using the term "lateralization", thinking of the left hemisphere as "specialized" for language helps illustrate this dominance while leaving room for certain processes that are in fact products of right hemisphere activity.

Both signed and spoken languages utilize Broca's area and Wernicke's area, and each relies on processes of the motor cortex. Studies of patients with Broca's and Wernicke's aphasia reveal similar deficits in signing and speaking aphasics. Patients with Broca's aphasia have trouble with word and sign production and with fluency, while patients with Wernicke's aphasia show parallel deficits in comprehension and content.

Studies of left hemisphere and right hemisphere damaged patients reveal similarities as well. Left hemisphere damaged (LHD) patients experience paraphasias: "slips of the tongue" in speakers and "slips of the hand" in signers. They also have difficulty with fluency and with picture-naming tasks. Right hemisphere damaged (RHD) patients, whether speaking or signing, have difficulty with more global, holistic processes, such as following or maintaining fluent discourse.

In addition, research shows major contributions to language from areas such as the basal ganglia and the temporal gyri, which also play a role in a variety of other cognitive processes. These and other multi-function brain regions support the notion that brain structures have wide-reaching, dynamic capabilities that span modalities.


The increasing use of fMRI in such research allows a closer look at exactly which brain areas are activated, and it suggests that the neural organization of language is not as similar across modalities as once thought. Recent fMRI studies point to differences in hemispheric responses. Activations recorded while hearing bilingual participants perceived both spoken and signed language showed that some brain areas were engaged more strongly by one modality than the other. Interestingly, these studies show that sign perception involves both hemispheres and is more bilateral than spoken language perception; each modality also activates its own specific sub-hemispheric areas.

Simultaneous Learning

The ability of an individual to use both a spoken language and a signed language is known as bimodal bilingualism. It commonly occurs in hearing children of deaf parents, who learn spoken language from speaking relatives or teachers and sign language from their parents. This capability is a unique form of bilingualism because it requires two distinct sensory-motor systems. Neural organization appears flexible enough to control, process, and represent both languages. In bimodal bilinguals, second language acquisition interacts with both linguistic and non-linguistic abilities: acquiring a signed language has been shown to enhance a variety of non-linguistic visuospatial abilities, which are related to the processing demands of sign language.

Contrary to common misconceptions, hearing children of deaf parents do not struggle to learn spoken language, nor are they developmentally delayed in its acquisition compared to their monolingual speaking peers. Research suggests that if a bimodal bilingual child's language input consists of a minimum of 20% spoken language, they will reach developmental language milestones at around the same time as monolingual speaking children of the same age. This simultaneous learning of language can be attributed in part to fast mapping: children pair novel phonological forms with semantic representations after very little exposure to the word and little or no exposure to the referent, relying on non-linguistic cues such as eye gaze and gestures.
