HANDBOOK ON THE NEUROPSYCHOLOGY OF AGING AND DEMENTIA – Lisa D.
Ravdin and Heather L. Katzen
https://link-springer-com.proxy-ub.rug.nl/content/pdf/10.1007%2F978-3-319-93497-6.pdf
WEEK 2
Chapter 2 Consideration of Cognitive Reserve
Introduction to Cognitive Reserve
The idea of reserve against brain damage stems from the repeated observation that there is not a
direct relationship between degree of brain pathology or damage and the clinical manifestation of
that damage. How do brain function and structure become decoupled, and do certain person-specific variables provide reserve against the clinical effects of pathological brain changes? Several theoretical models have been put forth to address these questions.
The cognitive reserve (CR) model suggests that the brain actively attempts to cope with brain damage by using preexisting cognitive processing. Individuals
with high CR would be more successful at coping with the same amount of brain damage than those
with low CR. In this scenario, brain function rather than brain size is the relevant variable. This
characteristic distinguishes the CR model from the brain reserve model in which reserve derives from
brain size or neuronal count. According to the CR model, the same amount of brain damage or
pathology will have different effects on different people, even when brain size is held constant.
Many studies have demonstrated the beneficial effects of education, occupation, leisure, and
intellectual ability on dementia incidence.
To the extent that aspects of educational and occupational attainment reflect lifetime exposures that
would increase CR, it would be logical to expect that environmental exposures later in life would also
be beneficial.
A meta-analysis examining cohort studies of the effects of education, occupation, premorbid IQ, and
mental activities on dementia risk over approximately 7 years revealed that 25 of 33 datasets
demonstrated a significant protective effect of these variables. The summary overall risk of incident
dementia for individuals with high levels of the protective variable, as compared to low levels, was 0.54, corresponding to a 46% decrease in risk. There is also evidence for the role of education in age-related cognitive
decline, with many studies of normal aging reporting slower cognitive and functional decline in
individuals with higher educational attainment. These studies suggest that the same factors that
delay the onset of dementia also allow individuals to cope more effectively with brain changes
encountered in normal aging.
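The 46% figure is simply the complement of the summary risk ratio reported above; as a quick check of the arithmetic (no new data involved):

\[ \text{relative risk reduction} = 1 - \text{RR} = 1 - 0.54 = 0.46 \approx 46\% \]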
The concept of CR provides a ready explanation for the manner in which intellectual functioning,
education, and other life experiences may allow individuals to sustain greater burdens of brain
pathology or age-related changes before demonstrating cognitive and functional deficits.
Neuroimaging studies have also provided evidence in support of cognitive reserve and have
contributed to our conceptualization of this phenomenon.
In clinicopathological studies, with brain pathology held constant, higher education was associated with better cognitive function and a lower likelihood of having received a clinical diagnosis of dementia in life. These studies converge nicely with epidemiological evidence that higher levels of education, occupational attainment, and leisure activity reduce dementia incidence, and they suggest that these variables influence dementia risk by enhancing cognitive reserve.
Theoretical Issues
Despite the wealth of information that has accumulated in support of the concept of cognitive
reserve, there are many aspects of this construct that have yet to be fully elaborated.
First, the precise manner in which cognitive reserve affords protection from pathology is not
understood. The concept of cognitive reserve only applies when considering variability in cognitive functioning (e.g., memory) in the face of changes in brain integrity (e.g., hippocampal volume).
This raises one of the puzzling questions surrounding reserve: memory and hippocampal integrity are
intimately related, and the mechanisms underlying the decoupling of structure and function are not
clear. From a strict point of view, the differences in cognitive processing envisioned by the CR model
must also have a physiologic basis, in that the brain must ultimately mediate all cognitive function.
Cognitive reserve implies anatomic variability at the level of brain networks, while brain reserve
implies differences in the quantity of available neural substrate.
Moreover, it has more recently been recognized that life exposures that are associated with reserve
also affect brain structure or brain pathology and not simply cognitive properties. This has been
referred to as brain maintenance. Both exercise and cognitive stimulation regulate factors that
increase neuronal plasticity (such as brain-derived neurotrophic factor) and resistance to cell death.
Finally, there is some evidence to suggest that environmental enrichment might act directly to
prevent or slow the accumulation of AD pathology [35]. All of these considerations lead to the
conclusion that brain maintenance acts to help preserve the brain over time. In this regard, we can consider brain reserve to be the current state of the brain as shaped by brain maintenance.
In sum, there appears to be growing evidence that the experiences that provide cognitive reserve
may indeed reflect not only a cognitive advantage but a structural advantage as well. Thus, brain
reserve and cognitive reserve concepts are not mutually exclusive, and it is likely that both are
involved in providing reserve against brain damage.
Setting aside the question of brain integrity, and considering cognitive reserve only, we return to the
question of why insult to brain structure does not invariably affect cognition. We have observed that
individuals with higher cognitive reserve (defined using a literacy measure) have less rapid memory
decline over time than those with lower literacy levels [36]. However, the manner in which this
memory advantage is conferred is unknown. Stern and colleagues have described two potential neural implementations of cognitive reserve: neural reserve and neural compensation.
The idea behind neural reserve is that there is natural interindividual variability in the brain networks
or cognitive processes that underlie the performance of any task. An individual whose networks are
more efficient, have greater capacity, or are more flexible might be more capable of coping with the
challenges imposed by brain pathology. In contrast, neural compensation refers to the process by
which individuals suffering from brain pathology use brain structures or networks (and thus cognitive
strategies) not normally used by individuals with intact brains in order to compensate for brain
damage. The term compensation is reserved for a situation where it can be demonstrated that the
more impaired group is using a different network than the unimpaired group.
A compensatory reserve mechanism might be less applicable to spatial skills than to verbal memory,
because there may be fewer cognitive routes to reproduce a complex figure or detect a subtle visual
detail amid a complex scene. However, it is also possible that the critical issue is not task specific but,
rather, person specific. That is, based on life experience, one person may have multiple ways of
approaching a spatial task but less flexibility for a verbal task, whereas the opposite pattern may exist
in another individual. If the crux of cognitive reserve is the ability to apply alternative approaches to
accomplish tasks, then the benefit of reserve may be linked directly to the flexibility of the task (and
corresponding skill) itself or to a person’s premorbid cognitive style.
One final question is whether or not deterioration of specific cognitive functions can directly affect
cognitive reserve. Is cognitive reserve itself vulnerable to a particular presentation of disease? Or, is
cognitive reserve a construct that is “immune” to the regional distribution of pathology, independent
of the cognitive abilities that may be affected, functioning universally under a wide variety of lesions?
One perspective, although speculative, of cognitive reserve is that it represents the mental flexibility
to develop alternative strategies in the face of pathology and to fluidly apply such strategies to the
task at hand. For the time being, studies addressing this question may offer preliminary evidence either that (1) reserve is immune to the distribution of pathology or (2) reserve is fundamentally different from the cognitive skills assessed in these studies.
Estimating Cognitive Reserve
A practical question for the clinician is how to account for cognitive reserve in the diagnostic process.
In this section, we review the advantages and disadvantages of several approaches including the
following: (1) measurement of individual characteristics (demographic and lifestyle), (2)
consideration of cumulative life experiences, (3) estimation of intellectual functioning, (4)
implementation of statistical approaches (use of latent or residual variables), and (5) derivation of
brain network patterns.
Individual Characteristics
One of the most commonly used methods of characterizing reserve involves quantifying individual
characteristics that have been associated with reduced risk of dementia including education,
occupation, intellectual functioning, leisure activity, and social engagement. The advantage of this
approach is that these variables are relatively easy to acquire and quantify and, at face value, are
generally plausible proxies for reserve. A disadvantage is that these variables may be singular
representations of a multidimensional mechanism such that characterization of education in
isolation, for example, might account for a relatively small proportion of the variance in overall
cognitive reserve. Moreover, these variables are rather agnostic with regard to the source and nature
of cognitive reserve and may confound multiple other factors with “true” reserve (e.g., education
may impart greater knowledge and access to health care which in turn may promote health-related
behaviors and enhance cognitive functioning).
Cumulative Life Experiences
A second approach for characterizing cognitive reserve is one in which multiple or cumulative life
experiences are synthesized to develop a more comprehensive estimation of an individual’s reserve.
The purported benefit of this approach is that it synthesizes numerous experiences, all of which have
been shown through epidemiological work to confer protection against the development of
dementia. The consideration of comprehensive life experiences offers the opportunity to capture a
wide array of factors that may uniquely contribute to reserve, if indeed reserve is created through a
cumulative process.
It is possible that the summation of experiences within such a questionnaire (the Lifetime of Experiences Questionnaire, LEQ) may not be more predictive than any individual variable, and compiling these experiences may even obscure the effect of the most relevant variable.
Intellectual Functioning
A third and very different means of characterizing reserve is the assessment of intellectual
functioning, typically via a single-word reading test. Word reading measures evaluate an individual’s
ability to pronounce a series of phonologically regular and irregular words ranging in difficulty and
are based on the idea that correct pronunciation of the more difficult items requires previous
exposure to such words. Like vocabulary and fund of information, this ability is generally spared early
in the course of dementia, reflecting its reliance on long-term, crystallized knowledge versus the
more fluid abilities affected early in disease.
The characterization of IQ is believed to offer a thumbnail sketch of an individual’s lifetime
intellectual achievement, highly related to, though not necessarily synonymous with, the concept of
cognitive reserve. An advantage of using IQ to characterize cognitive reserve is that in contrast to an
external exposure variable such as education or occupation, an internal and broadly stable capability
such as IQ is presumably more closely associated with the cognitive and neural representation of
reserve. Unfortunately, a corresponding disadvantage is that IQ scores do change in the course of
disease and therefore can be contaminated by the disease process itself (unlike education or
occupation). Moreover, while reading scores are fairly stable in the very early stages of degenerative
illnesses, they are certainly not valid estimates of premorbid IQ in a language-predominant illness,
nor are they valid estimates in nonnative English speakers.
Statistical Approaches
One set of statistical approaches uses latent variables to represent reserve. Although the details of these models are beyond the scope of this chapter, the idea is that through statistical data reduction, we can boil down the overgeneralized concept of reserve into its core elements and identify those variables that are central to its construct versus those that may be extraneous. A necessary drawback, however, is that representation of cognitive reserve through shared variance may not reflect aspects of reserve potentially captured selectively by each unique variable.
A second statistical approach has recently been proposed by Reed and colleagues: the variance of a specific cognitive skill such as episodic memory is decomposed, and the portion left unexplained by brain variables and demographics (the residual) serves as the estimate of reserve. Results showed that
residual scores correlated with another measure of reserve (word reading), modified rates of
conversion from mild cognitive impairment to dementia over time, and modified rates of decline in
executive function. Finally, baseline brain status had less of an effect on cognitive decline over time
in individuals with high residual scores than in those with low residual scores.
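A minimal sketch of the residual idea follows; this is not Reed and colleagues' exact pipeline, and the predictors, data, and function name are hypothetical. A memory score is regressed on brain and demographic variables, and the leftover variance is treated as the reserve estimate:

```python
# Sketch of a residual-based reserve score: regress episodic memory on
# brain structure and demographics, and keep what the model cannot explain.
import numpy as np

def residual_reserve(memory, hippo_vol, age, education):
    """Residuals of memory after removing variance explained by
    hippocampal volume, age, and education (all hypothetical inputs)."""
    X = np.column_stack([np.ones_like(memory), hippo_vol, age, education])
    beta, *_ = np.linalg.lstsq(X, memory, rcond=None)  # ordinary least squares
    return memory - X @ beta  # positive = memory better than predicted

# Hypothetical data for five individuals (z-scored memory, volume in cm^3)
memory = np.array([0.8, -0.2, 0.5, -1.0, 1.2])
hippo_vol = np.array([3.1, 2.8, 3.0, 2.5, 3.2])
age = np.array([72.0, 75.0, 70.0, 80.0, 68.0])
education = np.array([16.0, 12.0, 14.0, 10.0, 18.0])
print(residual_reserve(memory, hippo_vol, age, education))
```

A positive residual marks a person whose memory is better than their brain measures and demographics would predict, which is the sense in which the resulting score is specific to the individual.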
In addition to providing an operational measure of reserve that is quantitative, continuous, and
specific to the individual, the residual approach to characterizing reserve allows the estimate of
cognitive reserve to change over time. Practically speaking, a primary drawback to using residual
scores is that it is currently not feasible for the clinician to apply such scores on an individual basis.
Brain Network Patterns
A future goal for representing reserve is through an identifiable brain network or series of networks.
Such networks might be derived using functional imaging techniques that capture the neural
signature of cognitive reserve.
The utility of a brain network for capturing cognitive reserve is multifold. First, to the extent that
reserve truly has a neural signature, the identification of a brain network that “behaves” like
cognitive reserve (e.g., correlates with traditional reserve variables, persists across divergent task
demands, and interacts with task performance in the expected way) would be a more direct way to
measure the construct. Second, a brain network would be a nonbiased characterization of reserve
that could be used universally in a manner that tests such as vocabulary or single-word reading
cannot, due to their influences from culture and language. Third, a brain network is malleable in a
way that fixed life experiences are not and thus lends itself to examination in the context of a
longitudinal study. For example, interventional studies aimed at increasing reserve could use a brain
network to measure reserve both pre- and postintervention, and unlike cognitive testing, this
network would be resistant to practice effects.
Application of Cognitive Reserve in Clinical Practice
When assessing cognition as part of a diagnostic evaluation, it is important to take into account the
most appropriate and valid indicator of cognitive reserve for a given patient. In the event that an
individual’s level of education is not believed to be a good representation of his or her optimal
cognitive functioning, assessment of IQ or consideration of occupation may provide a more accurate
estimate.
Integration of the most appropriate and valid measure of cognitive reserve into the diagnostic
formulation is critical. Individuals with high reserve, by definition, will not demonstrate clinical
symptoms as early as individuals with low levels of reserve. To some extent, this issue could be a problem with instrumentation, such that (1) more challenging tests with higher ceilings
may better detect changes in individuals with very high levels of functioning, (2) tests that are more
pathologically specific (e.g., associative learning tasks for the hippocampus) may have greater
sensitivity in high reserve individuals, or (3) better normative data may allow for better detection of
impairment in individuals with high levels of intellectual functioning. Indeed, quantitative
consideration of IQ scores appears to improve the sensitivity of cognitive testing for detecting
pathology.
In theory, there would still be a period of time during which even the most sensitive measures would
fail to detect change in those with high reserve given the apparent “lag” between pathological
changes and their cognitive sequelae. Therefore, from a clinical standpoint, neuropsychological
testing will be less sensitive to the presence of early pathology in those with high reserve even when
we consider current test scores in the context of a person’s optimal level of functioning (e.g., IQ,
education). As such, the only action to be taken by clinicians is to be aware of this conundrum and to
appreciate that intact cognition in individuals with high levels of reserve does not preclude the
presence of disease.
The standard and generally useful approach taken by neuropsychologists is to formally adjust
cognitive scores for education, a procedure which, in theory, allows for the interpretation of current
cognitive performance in the context of an individual’s expected performance.
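A minimal sketch of what such an adjustment does is shown below; the regression coefficients are invented for illustration and are not from any published normative study:

```python
# Regression-based demographic adjustment: score a raw test result against
# the performance expected for a given number of years of education.
def education_adjusted_z(raw_score, years_education,
                         intercept=20.0, edu_slope=0.8, residual_sd=4.0):
    """Z-score relative to education-predicted performance.
    All coefficients are hypothetical placeholders."""
    expected = intercept + edu_slope * years_education
    return (raw_score - expected) / residual_sd

# The same raw score of 24 is unremarkable for 8 years of education...
print(education_adjusted_z(24, 8))    # -0.6
# ...but clearly below expectation for 18 years of education.
print(education_adjusted_z(24, 18))   # -2.6
```

The design point is that a raw score carries no diagnostic meaning on its own; it becomes interpretable only relative to what a person of that background would be expected to score.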
Information regarding brain integrity should be integrated with cognitive data for diagnostic
purposes, whenever possible. Neuroimaging tools have the potential, particularly in individuals with
high reserve who maintain cognitive functioning for an extended period of time, to detect
pathological changes when impairment on neuropsychological testing is absent or subtle.
More recently, the sensitivity of a variety of imaging tools for detecting pathological changes prior to
cognitive change has been demonstrated on structural MRI [69] and functional MRI (fMRI), as well as
through examination of activity level in the default network on resting fMRI. Moving forward, in vivo
amyloid imaging, although not currently used in clinical practice, will certainly play an important role
in identifying neuropathological changes in asymptomatic individuals as the field moves toward
earlier identification of disease. While these various technologies enable the consideration of
cognitive reserve as a factor influencing the clinical presentation and diagnosis of a patient, a current
challenge to integrating imaging information is applying results from group studies to individual
patients.
Another recommendation for applying the concept of cognitive reserve to clinical practice is to
consider it as a factor that will influence rate of cognitive decline following diagnosis. Decline is more
rapid in individuals with high reserve than in those with low reserve. This counterintuitive acceleration
in rate of change is believed to reflect the increasingly high pathological burden that the brain can no
longer tolerate. Certainly, this has practical implications for the patient, family, and health-care
providers. It may also have direct relevance for the effectiveness of treatment.
Cognitive reserve may influence an individual’s response to treatment with currently available
medications as well as future drug therapies. The treatment of degenerative diseases such as
Alzheimer’s disease is certain to be most effective when done preventatively, when the burden of
pathology in the brain is very low or absent altogether. Thus, in order to develop reasonable
expectations about a medication’s effectiveness, it will be important to have knowledge of three
variables: cognitive performance, cognitive reserve, and pathological burden.
A final insight for clinicians is that while a wide range of evidence exists from epidemiological studies
linking certain life experiences and individual characteristics to lower rates of dementia, this evidence
is not sufficient to determine definitively whether or not such experiences directly prevent or delay
dementia. While recommending that patients engage in certain activities such as mental enrichment
and physical fitness is likely not to be harmful and may in fact have numerous positive effects,
clinicians should be careful not to present these activities as established treatments or fully proven
preventative strategies against dementia.
Clinical Pearls/Summary
- When formulating clinical impressions, apply the most appropriate and valid indicator of
cognitive reserve for each individual patient. This may be an individual characteristic such as
level of education; a representation of cumulative life experiences spanning social, academic,
occupational, and leisure activities; or a measure of intellectual functioning. Moving forward,
statistically and neuroanatomically derived measures of cognitive reserve may also become
valuable for clinical purposes.
- Integrate neuroimaging tools to complement cognitive data for diagnostic purposes.
- Consider cognitive reserve as a factor that may affect rate of decline. The apparent yet
counterintuitive acceleration of decline associated with high cognitive reserve may reflect a state of
increasingly high pathological burden that the brain can no longer tolerate.
- Appreciate that cognitive reserve may be a factor that influences response to treatment.
- Be aware that epidemiological studies linking life experiences to reduced dementia risk are
observational, and intervention studies are needed to determine definitively if specific
experiences and activities enhance reserve and lower dementia risk.
WEEK 3
Chapter 28 Differentiating Mild Cognitive Impairment and Cognitive Changes of Normal Age – Caterina B. Mosti, Lauren A. Rog, and Joseph W. Fink
Introduction
Normal Cognitive Aging
As people live longer, scientists are given greater opportunity to improve their knowledge of the
structure and function of the aging brain. The oldest old (85 years and older) is the fastest-growing
segment of the population. Given these statistics, there is a great need for clinical services and
research focusing on normal and pathological cognitive aging.
It is generally accepted that some degree of cognitive decline associated with aging is inevitable, with
a great deal of variability as to when these changes begin. Interindividual variation in cognitive
performance in areas such as memory and fluid intelligence increases with age. Thus, with advancing age, the proportion of elderly persons who show normative age-associated cognitive decline increases. It can become difficult to distinguish “normal” cognitive aging from
pathological cognitive decline in the absence of neuropsychological testing with normative
comparison data.
Some aspects of cognition remain relatively intact with normal aging, including implicit memory,
vocabulary, and storage of general knowledge. The cognitive decline that typically accompanies
normal cognitive aging involves decreased efficiency in information processing in several areas,
including speed of processing, reaction time, working memory capacity, short-term memory,
executive control (e.g., inhibitory functions), and verbal fluency. Visuoperception, visuoconstruction,
and spatial orientation also decline with age.
Slowed processing speed is a key cognitive change in the aging brain. The consequences of reduced
processing speed include decreased working memory capacity, because less information can be processed
within a given time, as well as impaired higher-order cognitive functions such as abstraction or
elaboration, because the relevant information is no longer available in working memory or storage.
Age-related changes in working memory are likely due to reduced inhibitory mechanisms of selective
attention. That is, older adults show decreased ability to effectively suppress the processing of
irrelevant, or marginally relevant, stimuli and thoughts. Cognitive aging is also associated with poorer
effortful or controlled processing, while automatic processing remains relatively intact. Older adults
retain relatively good memory for “gist” or familiar stimuli, while source memory and recollection of
contextual details decline.
Normal age-related changes in language function include increased inefficiency in phonological
retrieval, resulting in word-finding difficulties that are often referred to as the “tip of the tongue”
phenomenon. It is suspected that the age-related decline in verbal fluency is at least partly due to
the substantial contributions of auditory attention and verbal memory abilities to the tasks, rather
than simply a primary degradation of semantic or lexical networks.
Structural Brain Changes
Numerous changes in brain structure accompany normal aging, including volumetric shrinkage,
decreased white matter density, loss of dopaminergic receptors, and the emergence of senile plaques and neurofibrillary tangles. The greatest degree of cortical thinning and volumetric brain
shrinkage across the lifespan occurs in the hippocampus, caudate, cerebellum, and calcarine (i.e.,
occipital) and prefrontal areas. Ventricular volume also increases in old age. Decreases in white
matter density and other white matter abnormalities are particularly evident in the frontal and
occipital regions of the brain. White matter changes may be the primary culprit for age-related
cognitive slowing, as white matter’s main function is to facilitate transmission of signals to and from
different areas of the brain via myelinated axons. As myelin integrity degrades with age, so does the
speed of cognitive processing.
Loss of dopaminergic receptors occurs with age and is thought to contribute to the attentional
dysregulation, executive dysfunction, and difficulty with contextual processing that accompanies
normal cognitive aging. The mechanism of context processing subserves cognitive functions such as
attention, working memory, and inhibition by affecting the selection, maintenance, and suppression
of information relevant (or irrelevant) to the task, accounting for the decline in these abilities with
age.
An autopsy study on clinically nondemented oldest old (age ≥ 85 at death; n = 9) found neurofibrillary
tangles (NFTs) in one or more limbic regions in all study participants. The most affected regions
included the entorhinal cortex, amygdala, subiculum, CA1 field of the hippocampus, and inferior
temporal regions. Midfrontal, orbitofrontal, and parietal regions were less affected, and occipital
regions were minimally affected in clinically nondemented persons. Senile plaque (SP) formation also
was observed in this group and was found to affect all brain regions equally, with the exception of
relative sparing of the occipital cortex. Two of nine participants who were nondemented in the few
months prior to death met pathological criteria for Alzheimer’s disease, suggesting individual
variability in the relationship between brain pathology and cognitive presentation.
Functional imaging techniques such as positron emission tomography (PET) and functional magnetic
resonance imaging (fMRI) allow for the examination of blood flow and oxygenation to particular
brain structures in participants as they engage in cognitive tasks. Comparisons of older and younger
adults reveal an increase in bilateral activation with age, whereby tasks associated with focal,
unilateral activation in younger adults (e.g., verbal memory) become associated with bilateral
activation in older adults. Further, bilateral activation in older adults is associated with better
performance on cognitive tasks, including working memory, semantic learning, and perception. This
suggests that the older brain engages in more widely distributed compensatory processing by
activating the contralateral hemisphere to achieve greater cognitive benefits.
Theories of Aging
In a process termed “dedifferentiation,” sensory function (i.e., visual acuity and audition) has been
shown to predict performance on a wide range of cognitive tasks in older, but not younger, adults. It
has been proposed that abilities that are relatively independent earlier in life, such as sensory ability
and cognition, become more interrelated with old age. Functionally, this can be thought of as a
decrease in neural specificity, whereby regions that respond selectively in younger adults change to
respond to a wider array of inputs in older adults. Similarly, in older adults, increased prefrontal
activation is associated with decreased parahippocampal activation and hippocampal volume
shrinkage.
Salthouse proposed the processing-speed theory of cognitive aging, which assumes that a wide range of cognitive task performances are limited by constraints on the speed of processing. Slow processing speed dampens cognition in two ways: (1) cognitive operations are
executed too slowly to be successfully completed in the available time and (2) the amount of
simultaneously available information, necessary for higher-level processing, is reduced, as early
processing is no longer available when new processing occurs. Complex operations are most affected
by slow processing speed since they are dependent on the products of simpler (and earlier)
operations.
The scaffolding theory of aging and cognition proposes that structural brain changes associated with
aging are accompanied by effort on the part of neural networks to maintain homeostatic cognitive
functioning in the face of these changes. This leads to changes in brain function through
“strengthening of existing connections, formation of new connections, and disuse of connections
that have become weak or faulty”. The initially engaged neural networks shift from broad and
dispersed to a specific and honed circuit of neural regions. While the more specific regions assume
dominant control over functions, the initial broad networks continue to be minimally active,
suggesting that they remain available for compensatory processing. In the aging brain, scaffolding is
thought to maintain healthy cognitive function in the face of neural degradation. In the oldest old, however, the need for compensatory scaffolding may exceed the available networks, resulting in a more profound decline in functioning.
Individual Factors in Cognitive Aging
Factors shown to contribute to cognitive reserve or to be related to cognitive decline in clinical
studies include education, occupational complexity, physical health, and diet. It is suspected that
cognitive reserve is represented biologically by a number of processes, including (1) richer
interconnectivity and organization of neural circuits; (2) alterations in synaptic efficiency, marked by
changes in neurotransmitter release, receptor density, and receptor affinity; and (3) changes in
intracellular signaling pathways.
Physical health status is arguably one of the more important factors to consider when predicting
performances on cognitive assessment in noncognitively impaired elderly.
Higher education has been associated with preserved cognitive performance over time (i.e., less
decline) in aging adults. Occupational complexity is shown to be related to relatively better cognitive
functioning with age, above and beyond the benefits afforded by higher levels of education.
Participants who held jobs with high complexity of work with people demonstrated better cognitive
performance on measures of verbal skills, spatial skills, and processing speed than participants with
low occupational complexity with people.
Mild Cognitive Impairment
Defining Mild Cognitive Impairment
Neuropsychological referrals are often made on the basis of a patient’s or their family’s perceived
(i.e., subjective) report of a decline in cognitive ability. The construct of MCI (Mild Cognitive
Impairment) represents a decline in cognitive performance greater than would be expected for the
person’s age but not sufficient to meet criteria for a diagnosis of dementia [58]. Petersen described
MCI as interposed between normal cognitive changes associated with aging and the very early stages
of a dementing process.
The original criteria for MCI proposed by Petersen et al. are as follows:
1. Presence of a memory complaint
2. Normal activities of daily living
3. Normal general cognitive function
4. Abnormal memory for age
5. Not demented
These criteria are particularly useful for patients who have impairment in the memory domain but
intact cognitive performance and functioning in all other domains. Such patients would be labeled as
having amnesic MCI (a-MCI).
The diagnostic criteria for MCI in a clinical setting are as follows:
1. Concern regarding change in cognition: There is evidence of concern for change in the
patient’s cognitive status as compared to his/her previous level.
2. Impairment in one or more cognitive domains: There is evidence of lower performance in
one or more cognitive domains that is greater than what would be expected for the patient’s
age and educational background.
3. Preservation of independence in functional abilities: The patient generally maintains his/her
independence of function in daily life without considerable aids or assistance.
4. Not demented: These cognitive changes are sufficiently mild so that there is no evidence of
significant impairment in social or occupational functioning.
Subtypes
We have already mentioned single-domain amnesic MCI (a-MCI), which is a useful category for
patients who have impairment in memory but intact cognitive performance in all other domains and
in daily functioning. Some patients display impairment in a single nonmemory cognitive domain (e.g.,
executive function) but perform normally in other domains, including memory. These patients would
be given labels of single-domain non-amnesic MCI (na-MCI). Still other patients present with
impairments in multiple domains while continuing to display relatively intact activities of daily living
(ADLs) and general cognitive functioning; these patients would be classified generally as having
multiple-domain MCI. More specifically, in the event that a deficit in memory is present, a patient is
given a diagnosis of multiple-domain MCI with amnesia (md-MCI + a); if memory impairment is not
evident, then a diagnosis of multiple-domain MCI without amnesia (md-MCI-a) is appropriate.
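The subtype labels above amount to a simple decision rule over the set of impaired domains. A sketch follows (assuming domain-level impairment has already been established by testing, and that ADLs and general cognition are otherwise intact):

```python
# Decision logic for the MCI subtype labels described above.
def mci_subtype(impaired_domains):
    """Map a set of impaired cognitive domains to an MCI subtype label."""
    if not impaired_domains:
        return "no impaired domains (subtyping does not apply)"
    if len(impaired_domains) == 1:
        return "a-MCI" if "memory" in impaired_domains else "na-MCI"
    return "md-MCI + a" if "memory" in impaired_domains else "md-MCI - a"

print(mci_subtype({"memory"}))                        # a-MCI
print(mci_subtype({"executive function"}))            # na-MCI
print(mci_subtype({"memory", "language"}))            # md-MCI + a
print(mci_subtype({"executive function", "speed"}))   # md-MCI - a
```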
Aetiology and Prognosis
In addition to different subtypes, there also are multiple etiologies for MCI. Petersen suggested four
main etiologies: (1) degenerative (e.g., Alzheimer’s disease), (2) vascular (e.g., cerebrovascular
disease), (3) psychiatric (e.g., depression), and (4) traumatic (e.g., head injury). Particular subtypes of
MCI are reported to be more commonly associated with certain etiologies.
Follow-up data from the initial Petersen et al. study on MCI using patients (N = 220) from the Mayo
Alzheimer’s Disease Center/Alzheimer’s Disease Patient Registry (ADC/ADPR) demonstrated a rate of
progression from MCI to dementia of 12% per year. At the same time, however, many persons with
MCI remain stable with this diagnosis or revert to normal. These data suggest that for some patients,
MCI represents an intermediate point on the continuum from normal cognition to dementia, while
for others, MCI is a transient period of cognitive decline that resolves with time. Those with na-MCI
are most likely to revert to normal or improve their cognitive status over time.
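Purely as an illustration of what a 12% annual rate implies (assuming, counterfactually, that the rate is constant and applies uniformly to everyone), the cumulative proportion expected to progress by year n is

\[ P(\text{converted by year } n) = 1 - (1 - 0.12)^n, \qquad \text{e.g., } 1 - 0.88^{5} \approx 0.47, \]

so roughly half of such a cohort would be expected to convert within five years under this simplifying assumption. The stability and reversion described above are precisely why real cohorts depart from this idealized curve.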