Theme 3: Memory
Neath, I., & Surprenant, A. M. (2003). Chapter 5: Perspectives on processing (only pp. 96-111)
In the previous chapters the emphasis was on how items are stored; now we look at how different kinds of
processing can influence memory.
• Levels of processing
Craik and Lockhart made four assumptions, all reflecting the idea that the type of processing is more important for
memory than the underlying theoretical structure (the stores).
o They conceptualised memory as the result of a series of analyses performed on the to-be-processed information, each at
a deeper level than the previous one (a continuum from shallow to deep rather than a set of discrete levels).
o The deeper the level, the more durable the resulting memory (so, focussing on the meaning of an item means it is
remembered better)
o The levels of processing view assumes that rehearsal per se is relatively unimportant; it will only be beneficial if it
induces a deeper level of processing (Type II rehearsal, or elaborative rehearsal).
o Research on memory will be most informative when the experimenter has control over the processing. The experimenter
should use an incidental learning procedure, so that the participant is unaware that the material will be tested later on.
Otherwise, subjects will use whatever processing they consider most appropriate, which may not be the processing the
experimenter wants.
Other factors, such as rating the pleasantness of an item, can be more important than the intent to remember. Craik and
Watkins also found that recall of an item was uncorrelated with how long it had been rehearsed using maintenance
rehearsal. This could be explained by the modal model, since the item would simply remain in the short-term store.
Problems with the levels of processing model:
o One of its key assumptions is circular: deeper levels of processing lead to better memory, and when there is better
memory, we attribute it to a deeper level of processing. There is no independent method for determining whether
process A is deeper or shallower than process B before the experiment is run.
o Levels of processing focuses almost exclusively on encoding and says little about retrieval. This is why a second view
was developed: transfer appropriate processing.
• Transfer appropriate processing
The difference between transfer appropriate processing and levels of processing is that the former includes retrieval. A
process leads to better performance not because it is deeper but because it is appropriate for the kind of test that will be
conducted. Generally, deep tasks lead to better performance, but free recall is not the only way of testing memory. In
some tasks, shallow processing can lead to better performance (a rhyme recognition task, for example).
According to Fisher and Craik, the reason for this is that semantic coding produces a more specific cue, which in turn leads
to enhanced discriminability from competing cues. This is related to Watkins' cue overload principle, which states that only
so many items can be associated with a cue before the cue begins to lose its effectiveness - the more items are associated
with a cue, the less effective the cue will be in eliciting the desired item.
• Organisation and distinctiveness
The levels of processing framework includes the idea that a deeper level of processing produces a more distinctive cue
than does a shallow level of processing: the more distinctive an item, the more it differs from competing items. Other
literature demonstrates that organisation helps memory, and when incorporating information into an organisation,
similarity matters most. So, both similarity and difference can be beneficial to memory.
Organisation concerns the relationships among the pieces of information that are to be remembered; such relations form a
cornerstone of Gestalt theory, and the same idea underlies clustering. Both organisation and distinctiveness are important
because both reflect particular types of processing (relational versus item-specific processing).
• The encoding specificity principle
The recollection of an event, or of a certain aspect of it, depends on the interaction between the properties of the encoded
event and the properties of the retrieval information. The encoding specificity principle relates to transfer appropriate
processing, since both emphasise the interaction between the processing that occurred during encoding and the
processing that occurs at retrieval. Weak cues presented at study can be very effective at test, provided the retrieval cue is
the same weak cue.
An item is not necessarily the best cue for itself; what matters is not the literal match between the encoding and the
retrieval cue but the appropriateness of the processing (although appropriate processing is often similar processing).
• Context and memory
Context effects can be predicted by the encoding specificity principle (or by any view of memory that emphasises
processing); for example, students tested in the classroom in which they studied did better than students tested in a
different classroom. Context can be defined in many different ways. One distinction is between context alpha (the
environmental surround in which some event exists or occurs) and context beta (the situation in which one stimulus event
combines with another stimulus event).
The context-dependent memory effect refers to the observation that memory can be worse when tested in a new or
different context than when the context remains the same between study and test.
State-dependent memory refers to the analogous effect when a person's internal state (affective or pharmacological) is
changed from study to test (sober versus intoxicated). Performance is worse when switching from the unusual to the
normal condition than when switching from the normal to the unusual condition, regardless of which condition (sober or
intoxicated) is the normal one.
Context-, mood- or state-dependent memory effects are larger (1) with tests such as free recall, which require internally
generated cues, than with tests like cued recall or recognition, which also have externally provided cues, and (2) when
tests emphasise conceptually driven tasks (meaning and relationships) rather than data-driven (perceptual processing)
tasks.
In an experiment by Eich and Metcalfe, subjects recalled more words when their moods at study and test matched than
when the moods were mismatched, regardless of whether they had read or generated the items. For generated words,
recall was superior and the mood effect was larger. This supports the idea that to the extent that thoughts are important in
the processing of information, there will be large effects of environment, state and mood congruency. If thoughts are
relatively unimportant, then context effects will be difficult to obtain.
Context and state-dependent memory can be seen as examples of transfer appropriate processing. There is an important
distinction between mood-dependent memory and mood-congruent effects; mood congruency refers to the finding that
a given mood tends to cue memories that are consistent with that mood, rather than memories that are incongruent. In
this way, mood serves as a cue.
Mood-dependent memory: if you are in the same mood at study and test, you remember more than when you are in
different moods (regardless of which mood).
Neath, I., & Surprenant, A. M. (2003). Chapter 6: Forgetting (only the part on ‘Decay versus interference in
immediate memory’, pp. 125-129)
• Decay versus interference in Immediate memory
The Brown-Peterson paradigm is one area in which both a decay explanation and an interference-based explanation were
offered. Subjects are given three letters and are then asked to count backward (or perform another distracting activity) for a
specified amount of time. The usual finding is that after 20 seconds of counting backwards, very few people can recall the
letters. This was taken as evidence for a separate short-term store and for a major role of decay in explaining the loss of
information. There has been a lot of research on the paradigm, but it has not been very promising for the decay
explanation.
o Decay explanation - The decay explanation says that the more time elapses, the more decay will occur. Although
performance did seem to become poorer and poorer, the loss could also be due to proactive interference, because the
consonant trigrams on successive trials are highly similar. Peterson and Peterson were aware of this and tested whether this
was the case, but they found no proactive interference (PI); if anything, performance increased. Keppel and Underwood,
however, did find a proactive interference effect: they looked at the first 4 trials separately instead of averaging over 12,
and found that performance became quite poor after 3 trials.
o Proactive interference builds up when the list items are all highly similar. If the stimulus type is changed, however, there
should be less PI. For example, if the to-be-remembered items on the critical trial consist not of consonants but of
numbers, the switched group does much better than the control group. So, the PI account predicts that performance can
increase, whereas the decay explanation cannot.
The amount of reduction (or release from PI) can be calculated.
Let x be the difference in performance between the experimental (switched) and control groups on trial 4, and let y be the
difference between the control group's performance on trial 1 and its performance on trial 4; the percentage release from
PI is then x/y * 100. If the release is 100%, performance on the fourth trial is the same as performance on the first trial; if it
is close to 0, there is no release.
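A hypothetical worked example (the numbers are invented purely for illustration): suppose the control group recalls 80%
of items on trial 1 and 40% on trial 4, while the switched group recalls 70% on trial 4. Then x = 70 - 40 = 30, y = 80 - 40 =
40, and the release from PI is 30/40 * 100 = 75%, i.e. the category switch recovers three quarters of the performance lost
to the build-up of PI.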
This is release from proactive interference; it is a matter of retrieval.
Most releases are symmetric - the amount of release is the same when switching from consonants to numbers as when
switching from numbers to consonants. This means that it is not that one type of material is easier to recall; rather, the
switch itself is crucial.
Release also occurs with shifts along the semantic differential: when all the words on the first three trials have similar
meanings (e.g., something pleasant) and the words on the last trial mean something different (sick, kill), there is about 55%
release from PI. The effect is not confined to the laboratory; with news items from different categories, release from PI can
reach about 70%.
The most promising account of the locus of release from PI explains it as a retrieval effect.