Week 2 Review
Perceptrons

A neuron’s output is the dot product of the connection weights and the inputs

This product must exceed a threshold for the neuron to ‘fire’

bias = activation threshold (the threshold folded into the weighted sum as a learnable parameter)

There is a bias parameter for each neuron

There is a weight parameter for every neuron connection

A linear (step-output) perceptron can only output a zero or a one, so its output is not differentiable
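
A minimal numpy sketch (illustrative, not from the lecture; the weights, bias, and inputs are made up) of such a step-output perceptron:

```python
import numpy as np

def perceptron(x, w, b):
    """Fires (outputs 1) when the dot product w . x plus the bias exceeds zero."""
    return 1 if np.dot(w, x) + b > 0 else 0

x = np.array([1.0, 0.0])   # inputs
w = np.array([0.6, 0.6])   # one weight per connection
b = -0.5                   # the bias encodes the firing threshold

print(perceptron(x, w, b))  # 1: the weighted sum 0.6 exceeds the 0.5 threshold
```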

An activation function gives a gradual, continuous transition between zero and
one for the output, which is necessary for differentiation. It is added to a
perceptron to provide non-linearity.

Without nonlinear activations, any number of linear combination layers
could be collapsed into a single linear combination, so there would be no
point in stacking them in a sequence.
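
A quick numpy check (not from the notes) that two stacked linear layers collapse into a single equivalent linear map:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)         # input vector
W1 = rng.standard_normal((5, 4))   # first linear layer (no activation)
W2 = rng.standard_normal((3, 5))   # second linear layer (no activation)

two_layers = W2 @ (W1 @ x)         # a sequence of two linear layers
one_layer = (W2 @ W1) @ x          # one equivalent linear layer

print(np.allclose(two_layers, one_layer))  # True: the sequence adds nothing
```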

Sigmoid activation function ϕ(z) = 1/(1 + exp(−z))
A forward pass is deterministic (the same input will always produce the same
output)
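
A tiny sketch (not from the notes) of the sigmoid and its deterministic forward behavior:

```python
import numpy as np

def sigmoid(z):
    """phi(z) = 1 / (1 + exp(-z))"""
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-4.0, 0.0, 4.0])
print(sigmoid(z))                   # ~[0.018 0.5 0.982]: a gradual, continuous transition
print(np.array_equal(sigmoid(z), sigmoid(z)))  # True: same input, same output
```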

Multi-Layer Perceptrons

A neural network with 2 layers can approximate any continuous function, and
one with 3 layers can approximate any function

A = ϕ(WX + b)

A = activations (model output)

W = weight matrix (there is a weight parameter for every neuron connection)

X = inputs

b = vector of biases (there is a bias parameter for every neuron)
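
A minimal numpy sketch of one layer's forward pass A = ϕ(WX + b); the shapes and values here are made-up examples:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer_forward(W, X, b):
    """Activations for one layer: A = phi(W X + b)."""
    return sigmoid(W @ X + b)

X = np.array([0.5, -1.2, 3.0])      # 3 inputs
W = np.array([[0.1, 0.4, -0.2],     # 2 neurons x 3 inputs: one weight per connection
              [0.7, -0.3, 0.5]])
b = np.array([0.05, -0.1])          # one bias per neuron

A = layer_forward(W, X, b)
print(A)  # this layer's activations; in the final layer, these are the network output
```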



In the final layer of a neural network, the output of each neuron’s activation
function is equal to the output of the network.

Back Propagation

The chain rule calculates the partial derivative of the loss with respect to
any parameter in the network.

We can calculate the partial derivative at any neuron even though we only
have information about the final output layer

The main goal of BP is to optimize the weights and biases in the network

The algorithm needs a dataset of inputs and target outputs to calculate a
gradient.

These are necessary to generate network output and calculate the
discrepancy via the loss function. Backpropagation will calculate the
gradient of this loss function with respect to the model’s parameters.

The formula for calculating the partial derivative of the loss function with
respect to any parameter in the network changes based on the depth of the
parameter.

A different equation is used for parameters of connections for the
output layer.

A forward propagation is done before every back propagation.

Steps in Back propagation

1) Calculate error at final layer

E = ½ (y − ŷ)²
2) Use the chain rule to calculate how to update the prior layer

∂f/∂x = (∂f/∂g)(∂g/∂h)(∂h/∂i)(∂i/∂x)




Error derivatives tell us how to change the weights

3) Repeat step two for all previous
layers
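
A worked sketch of the three steps on a one-neuron sigmoid network (made-up values; a deeper network would repeat step two layer by layer):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One input/target pair: backpropagation needs data to compute a loss gradient.
x, y = 1.5, 1.0
w, b = 0.2, -0.1     # the parameters the algorithm optimizes
lr = 0.5             # learning rate for the gradient step

for step in range(3):
    # forward pass first (required before every backward pass)
    y_hat = sigmoid(w * x + b)
    E = 0.5 * (y - y_hat) ** 2          # step 1: error at the final layer

    # step 2: chain rule, dE/dw = dE/dy_hat * dy_hat/dz * dz/dw
    dE_dyhat = y_hat - y                # derivative of (1/2)(y - y_hat)^2
    dyhat_dz = y_hat * (1.0 - y_hat)    # derivative of the sigmoid
    dE_dw = dE_dyhat * dyhat_dz * x
    dE_db = dE_dyhat * dyhat_dz

    # use the error derivatives to change the weights (gradient descent)
    w -= lr * dE_dw
    b -= lr * dE_db
    print(f"step {step}: E = {E:.4f}")  # the error shrinks step by step
```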


