Task 4:
Language
1. How does language develop (+ in relation to the brain areas)?
Linguists break language down into component parts. In their view, words consist of fundamental language
sounds, called phonemes, that form a word or part of a word.
Components of a sound-based language:
Phonemes: individual sound units whose concatenation, in particular order produces
morphemes
Morphemes: smallest meaningful units of a word, whose combination forms a word
Lexicon: collection of all words in a given language; each lexical entry includes all
information with morphological or syntactical ramifications but does not include
conceptual knowledge
Syntax: grammar; admissible combinations of words in phrases and sentences
Semantics: meanings that correspond to all lexical items and all possible sentences
Prosody: vocal intonation which can modify the literal meaning of words and sentences
Discourse: linking sentences to constitute a narrative
Another characteristic of human language is its use of syllables made up of consonants
(medeklinkers) and vowels (klinkers; a-e-i-o-u). Our mouths are capable of producing consonants
and combining them with vowels to produce syllables.
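The phoneme-to-morpheme-to-word hierarchy described above can be sketched as a toy data structure. The morpheme inventory and phoneme spellings below are illustrative assumptions, not a real phonological analysis:

```python
# Toy sketch of the compositional hierarchy in these notes:
# phonemes concatenate (in a particular order) into morphemes,
# and morphemes combine into words. The entries below are
# illustrative placeholders, not real linguistic data.

# Morphemes: smallest meaningful units, each an ordered sequence of phonemes.
morphemes = {
    "un-": ["ʌ", "n"],                # negation prefix
    "break": ["b", "r", "eɪ", "k"],   # root
    "-able": ["ə", "b", "l"],         # adjective-forming suffix
}

def build_word(*parts):
    """Combine morphemes into a word and return its phoneme sequence."""
    word = "".join(p.strip("-") for p in parts)
    phonemes = [ph for p in parts for ph in morphemes[p]]
    return word, phonemes

word, phonemes = build_word("un-", "break", "-able")
print(word)      # unbreakable
print(phonemes)  # ['ʌ', 'n', 'b', 'r', 'eɪ', 'k', 'ə', 'b', 'l']
```

The sketch only illustrates the layering: a word's meaning comes from its morphemes, while its sound comes from the phonemes those morphemes are built from.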
Core language skills
Four core skills underlie human language:
1. Categorizing: as the cortex expands and the number of channels that process parallel
sensory information increases, binding the information into a single perception of reality
becomes difficult. Thus, it is necessary to categorize information. Assigning tags to
information makes it easier to receive the information and to retrieve it later when
needed. The ventral visual stream coursing through the temporal lobes participates in
object categorization, and the dorsal stream may also participate by making relatively
automatic distinctions between objects, such as plant versus animal or human versus
nonhuman.
2. Category labeling: the development of human language may have entailed selection for
novel means of categorization that not only allowed for combining and grouping simple
sensory stimuli but also provided a means of organizing events and relations. This
categorizing system can stimulate the production of word forms for a given concept (the
category); conversely, words can cause the brain to evoke the corresponding concepts. Thus, labeling
a category includes not only identifying it, a function of the temporal lobes, but also
organizing information within the category. This is a function of the motor cortices in the
frontal lobes within the dorsal visual stream.
3. Sequencing behaviors: human language employs transitional larynx movements to form
syllables. Left-hemisphere structures associated with language form part of a system that has
a fundamental role in ordering vocal movements such as those used in speech. Sequencing
words to represent meaningful actions likely makes use of the same dorsal-stream frontal
cortex circuits that sequence motor action more generally.
4. Mimicry: from birth babies show a preference for listening to speech over other sounds.
Babies mimic and subsequently prefer the language sounds made by the people in their
lives. One view related to mimicry is that mirror neurons in the cortical language regions
are responsible for our ability to mimic the sounds, words and actions that comprise
language.
Brain areas associated with language
Some researchers refer to sulci, others to Brodmann’s areas,
and still others to areas associated with syndromes, such as
Broca’s area and Wernicke’s area.
Figure 19.6A includes the inferior frontal gyrus and the
superior temporal gyrus, in which Broca’s area (green)
and Wernicke’s area (yellow), respectively, are located.
Parts of surrounding gyri, including the ventral parts of
the precentral and postcentral gyri, the supramarginal
gyrus, the angular gyrus, and the middle temporal
gyrus, also lie within the core language regions.
Figure 19.6B depicts the language areas in accord with
Brodmann’s mapping. Broca’s area includes areas 45
and 44, and Wernicke’s area includes area 22. These
numbers come from Brodmann’s map, which assigns a number to every cortical region and serves as a reference for navigating the brain.
In Figure 19.6C the lateral fissure is retracted, showing the language-related areas found
within it, including the insula; Heschl’s gyrus (the primary auditory cortex); and
parts of the superior temporal gyrus referred to as the anterior and posterior superior
temporal planes.
Still other regions taking part in language include dorsal premotor area 6, responsible for the
rhythmic mouth movements that articulate sounds; parts of the thalamus, dorsolateral parts of the
caudate nucleus, and the cerebellum; visual areas, sensory pathways, and motor pathways; and
pathways connecting all of these various regions. In addition, many right-hemisphere regions
participate in language.
Neural connections between language zones
Wernicke’s early model of language and its revival by Geschwind, known as the Wernicke-Geschwind
model, were both based entirely on lesion data. This three-part model proposes that
comprehension is:
1. Extracted from sounds in Wernicke’s area,
2. Passed over the arcuate fasciculus pathway to
3. Broca’s area, to be articulated as speech.
The Wernicke-Geschwind model has played a formative role in directing language research and
organizing research results. A contemporary language model, based on recent anatomical and
behavioral studies, is illustrated in the figure.
As proposed, the temporal and frontal cortices are connected by pairs of dorsal and ventral
language pathways, which are viewed as extensions of the dorsal and ventral visual streams. The
double-headed arrows on both paired pathways indicate that information flows both ways
between the temporal and frontal cortex.
Information from vision enters the auditory language pathways via the dorsal and
ventral visual streams and contributes to reading.
Information from body-sense regions of the parietal cortex also contributes to the dorsal
and ventral language pathways and likely contributes to touch language such as braille.
The dual stream model
The dual stream model of afferent information processing is similarly applied to the auditory
processing system, in which the ventral stream processes “what” and the dorsal stream processes
“where”.
The dual stream model is extended to explain the cortical organization of language. The processing system
diverges into two streams: a ventral stream, which maps sound onto meaning, and a dorsal stream, which
maps sound onto articulatory-based representations to yield production.
The ventral stream is thus a sound-meaning interface, responsible for processing speech signals
for comprehension.
The dorsal stream translates acoustic speech signals into articulatory representations, involving
auditory-motor integrations.
The dual streams are also thought to be bi-directional; the ventral stream mediates the relationship
between sound and meaning for perception and production, and the dorsal system can also map motor
speech representations onto auditory speech representations. The ventral stream projects
ventrolaterally and involves cortex in the superior temporal sulcus and the posterior inferior temporal lobe.
The dorsal stream projects dorso-posteriorly toward the parietal lobe and ultimately to frontal regions.
The dorsal stream is strongly left dominant, accounting for speech production deficits that are seen with
dorsal temporal and frontal lesions.
The dorsal language pathways are proposed to transform sound information into motor
representation (to convert phonological information into articulation). The ventral language paths
are proposed to transform sound information into meaning (to convert phonological information
into semantic information). Information flow in the dorsal pathways is bottom-up and occurs
when we are asked to repeat nonsense words or phrases.
Thus, the temporal cortex assembles sounds by phonetic structure and passes them along
to the frontal cortex for articulation. No meaning is assigned to sounds in this pathway.
Information flow in the ventral pathway is proposed to be more top-down, assigning meaning to
words and phrases, as occurs when we assign a specific meaning to a word.
The dorsal and ventral language pathways are engaged in syntax, with the dorsal pathway
categorizing sounds in terms of frequency of association and the ventral pathway extracting
meaning from the grammatical organization of words. Both sets of language pathways are also
proposed to be involved in short- and long-term memory for the phonetic and semantic
components of speech. Nonvocal language, including reading and sign language from the visual cortex
and braille from the parietal cortex, also uses these pathways.
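As a rough computational analogy, the division of labor between the two streams can be sketched as two functions over the same phonological input. The lexicon and motor mappings below are invented placeholders, purely for illustration, not real neural representations:

```python
# Toy analogy for the dual stream model: the same phonological input is
# mapped onto meaning by the ventral stream and onto articulatory motor
# representations by the dorsal stream. All mappings below are invented
# placeholders for illustration only.

ventral_lexicon = {"dog": "four-legged domestic animal"}  # sound -> meaning
dorsal_motor = {                                          # sound -> articulation
    "d": "tongue tap",
    "o": "rounded vowel",
    "g": "velar stop",
}

def ventral_stream(word):
    """Sound-to-meaning interface: comprehension (meaning is assigned)."""
    return ventral_lexicon.get(word, "unknown word")

def dorsal_stream(word):
    """Sound-to-articulation interface: production (no meaning required)."""
    return [dorsal_motor[ph] for ph in word]

print(ventral_stream("dog"))  # four-legged domestic animal
print(dorsal_stream("god"))   # ['velar stop', 'rounded vowel', 'tongue tap']
print(ventral_stream("god"))  # unknown word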
Dorsal language pathways: transform sound information into motor representation; convert
phonological information into articulation.
Ventral language pathways: transform sound information into meaning; convert phonological
information into semantic information.