Lecture notes Learning and Memory (5102LEGE9Y) Week 7
Lecture notes Week 7
Quizlet link: https://quizlet.com/Annabel2703/folders/leren-en-geheugen/sets
Contents
Lecture 17: Computational models of learning and memory (part II)
Lecture 18: Alzheimer's disease and dementia
Lecture 19: Memory consolidation & reorganisation
Lecture 17: Computational models of learning and memory (part II)
➢ Synapses are important for learning, so we need to be able to simulate them well.
❖ Recap:
➢ Basic ideas for modelling: how can we model individual neurons at different levels of detail?
➢ We covered 3 models at 3 levels: Hodgkin-Huxley (most biophysical, with membrane currents), Izhikevich (a little more simplified), and the leaky integrate-and-fire model (the simplest, but useful for basic features; used during the workshop, see the sketch below).
❖ How are models of learning implemented in these neural network models?
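❖ A minimal Python sketch of a leaky integrate-and-fire neuron, of the kind used during the workshop. The parameter values (membrane time constant, resting/threshold/reset potentials, input current) are illustrative assumptions, not the workshop's exact settings:

import numpy as np

def simulate_lif(I, dt=0.1, tau_m=10.0, V_rest=-70.0, V_reset=-75.0,
                 V_th=-55.0, R=10.0):
    """Integrate dV/dt = (-(V - V_rest) + R*I) / tau_m; spike and reset at threshold."""
    V = np.full(len(I), V_rest)
    spike_times = []
    for t in range(1, len(I)):
        dV = (-(V[t - 1] - V_rest) + R * I[t - 1]) / tau_m
        V[t] = V[t - 1] + dt * dV
        if V[t] >= V_th:              # threshold crossed -> emit a spike
            spike_times.append(t * dt)
            V[t] = V_reset            # reset the membrane potential
    return V, spike_times

# A constant 2 nA input for 100 ms (dt = 0.1 ms) produces regular spiking.
V, spikes = simulate_lif(I=np.full(1000, 2.0))
print(f"{len(spikes)} spikes, first at t = {spikes[0]:.1f} ms")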
Models of learning and memory
❖ Now that we can simulate a neuron, we can start looking at learning rules
❖ We can classify the learning strategies of the brain into three main categories, which can also be used in artificial approaches to learning (a sketch contrasting the three follows this list).
➢ Unsupervised learning: the neural network receives input from the outside world only; the synaptic weights change as a function of that input (and the network's internal activity).
▪ input from the outside world -> the activity between the input and layer x is the main thing that changes the output(?)
▪ nothing oversees how the network learns and performs; it is free, unsupervised
➢ Supervised learning: the neural network receives input from the outside world and also the desired output, so that the network can change its synaptic weights to reach that output.
▪ compare the actual output with the desired output -> adjust the weights
➢ Reinforcement learning: the neural network receives input from the outside world, and a reward/punishment teaching signal which biases the learning towards a desired output.
▪ an extension of supervised learning with a teaching signal; the signal is not the desired output we want, so not a very specific signal, but a reward/punishment
▪ (I wonder: does the network only get told "you did it wrong", or is it also given a direction such as "your output was too low", or is that more like supervised learning?)
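❖ A minimal sketch contrasting what each category means as a weight update for a single linear neuron. The specific rules chosen here (plain Hebb for unsupervised, the delta rule for supervised, a reward-modulated Hebbian update for reinforcement) and the learning rate are illustrative assumptions, not the lecture's formulations:

import numpy as np

rng = np.random.default_rng(0)
x = rng.random(5)            # input pattern from the outside world
w = 0.1 * rng.random(5)      # synaptic weights
eta = 0.01                   # learning rate (illustrative)
y = w @ x                    # output of a single linear neuron

# Unsupervised: weights change from the input (and internal activity) only.
w_unsupervised = w + eta * y * x

# Supervised: the desired output is also given, so the error drives the update.
y_desired = 1.0
w_supervised = w + eta * (y_desired - y) * x

# Reinforcement: only a scalar reward/punishment signal biases the update.
reward = 1.0 if y > 0.5 else -1.0
w_reinforcement = w + eta * reward * y * x

print(w_unsupervised, w_supervised, w_reinforcement, sep="\n")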
❖ Biological examples:
➢ Unsupervised learning: for example, receptive fields.
➢ Supervised learning: the links with biological mechanisms are still unclear. A good candidate is learning in the cerebellum (teaching signals).
➢ Reinforcement learning: classical conditioning.
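❖ As an illustration of how a reward-based teaching signal can drive learning in classical conditioning, a short sketch of the Rescorla-Wagner update, a standard textbook model; it is not taken from the lecture slides, and the parameter values are assumptions:

# Rescorla-Wagner: the associative strength V of the conditioned stimulus
# moves toward the outcome it predicts, driven by the prediction error.
def rescorla_wagner(n_trials=20, alpha=0.3, beta=1.0, lambda_reward=1.0):
    V = 0.0                                   # associative strength (starts naive)
    history = []
    for _ in range(n_trials):
        prediction_error = lambda_reward - V  # the teaching signal
        V += alpha * beta * prediction_error
        history.append(V)
    return history

for trial, V in enumerate(rescorla_wagner(), start=1):
    if trial in (1, 5, 10, 20):
        print(f"trial {trial:2d}: V = {V:.3f}")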
Unsupervised learning
❖ Unsupervised learning is a learning process in which synaptic weights change as a function of the
inputs (and internal activity) only.
❖ Simplicity and plasticity -> good for experiments and from a computational point of view
❖ It is therefore easy to map this process to the learning of biological neural systems and changes in
biological synapses.
❖ The first biological principle of synaptic changes associated with learning is Hebb's principle: "Neurons that fire together, wire together" - Donald Hebb
➢ (figure: two connected neurons firing together) The synaptic weight increases
➢ "WHEEL" reminds us of the car
➢ Activation of one neuron -> activation of the neuron connected to it by the stronger synapse
❖ This principle allows to recover neural activity patterns, or neural assemblies, from incomplete or
noisy data, leading to the concept of associative memory.
➢ This happens without any kind of supervision from external agents.
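❖ A sketch of this idea of associative memory: store one binary activity pattern with a Hebbian (outer-product) rule and recover it from a noisy cue, without any supervision. The single-pattern Hopfield-style setup, the network size and the 20% noise level are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(1)
N = 100
pattern = rng.choice([-1, 1], size=N)     # stored activity pattern (neural assembly)

# Hebbian storage: units that are active together get a positive weight.
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0)                    # no self-connections

# Cue the network with a noisy version of the pattern: flip 20% of the units.
cue = pattern.copy()
cue[rng.choice(N, size=20, replace=False)] *= -1

# Recall: each unit repeatedly takes the sign of its weighted input.
state = cue.copy()
for _ in range(5):
    state = np.sign(W @ state)
    state[state == 0] = 1

print("fraction of units recovered:", np.mean(state == pattern))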
❖ We can consider a variety of learning rules to train neural networks in an unsupervised way.
➢ Some of these rules come from biology (e.g. refined versions of the Hebb rule, or other rules also found in synapses).
➢ Other rules can be considered on the basis of their theoretical and computational properties
(such as stability, simplicity or fast training times).
➢ We will cover several classical learning rules used in unsupervised learning
❖ These theoretical principles serve as guidelines.
❖ The BCM rule:
➢ Formulated by Elie Bienenstock, Leon Cooper and Paul Munro in 1982. It attempts to explain learning in the visual system.
➢ This rule is an extension of the Hebb rule (but for continuous values) which solves two important aspects of the stability problem of the Hebb rule. (Hebb alone would make synapses either stronger and stronger, or weaker and weaker, but we want something more stable.)
➢ More precisely, the BCM rule adds
▪ (i) a leaky term to incorporate depression and make unused synapses weaker,
▪ and (ii) a sliding threshold to balance potentiation with depression and prevent a runaway increase of the synaptic weights.
➢ Equation: it describes the temporal evolution of w_ij, the synaptic weight between neurons i and j (a sketch of the update follows below).
▪ φ(x) is a sigmoidal function, which imposes a cap on the increase of the synaptic weight.
▪ This function introduces a sliding threshold (θ), which provides the stability factor missing in the standard Hebb rule.
▪ The leaky term provides a long-term depression mechanism.
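❖ The equation from the slides is not reproduced above, so the sketch below assumes a common BCM-style form consistent with the description: dw/dt = eta * x * φ(y - θ) - λ * w, with a sigmoidal φ (here tanh), a sliding threshold θ that tracks a running average of y², and a small leaky term λ * w that weakens unused synapses. The exact φ, the threshold dynamics and all parameter values are assumptions, not necessarily those from the lecture:

import numpy as np

def bcm_step(w, x, y, theta, eta=0.01, lam=0.001, tau_theta=50.0, dt=1.0):
    """One Euler step of an assumed BCM-style rule (see the note above)."""
    phi = np.tanh(y - theta)                 # sigmoidal; sign flips at the sliding threshold
    dw = eta * x * phi - lam * w             # Hebbian-like term + leaky (depression) term
    dtheta = (y**2 - theta) / tau_theta      # threshold slides with recent activity
    return w + dt * dw, theta + dt * dtheta

rng = np.random.default_rng(2)
w, theta = 0.5, 0.1
for _ in range(2000):
    x = rng.random()                         # presynaptic activity
    y = w * x                                # postsynaptic activity of a linear neuron
    w, theta = bcm_step(w, x, y, theta)

# The weight settles at a bounded value instead of growing without limit, unlike plain Hebb.
print(f"final weight {w:.3f}, sliding threshold {theta:.3f}")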