Summary Learning Partial Exam 2

This is a summary for the second partial exam of the course Learning at the University of Amsterdam. The summary follows the order of the lectures.

Preview: 2 out of 10 pages · April 18, 2020 · 2016/2017 · Summary
Learning Summary 2
Decision Trees
Decision tree representation:
• Each internal node tests an attribute
• Each branch corresponds to an attribute value
• Each leaf node assigns a classification
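The representation above can be sketched as a small data structure (a minimal sketch; the class and field names are illustrative, not from the lecture):

```python
# Minimal decision-tree node: an internal node tests an attribute and
# branches on its values; a leaf node stores a classification.
class Node:
    def __init__(self, attribute=None, label=None):
        self.attribute = attribute   # attribute tested at this internal node
        self.children = {}           # attribute value -> child Node
        self.label = label           # classification, if this is a leaf

    def classify(self, example):
        # Walk down the tree until a leaf assigns a class.
        if self.label is not None:
            return self.label
        return self.children[example[self.attribute]].classify(example)

# Example: a node testing Humidity; High -> No, Normal -> Yes
leaf_no, leaf_yes = Node(label="No"), Node(label="Yes")
humidity = Node(attribute="Humidity")
humidity.children = {"High": leaf_no, "Normal": leaf_yes}
print(humidity.classify({"Humidity": "Normal"}))  # Yes
```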

Top-Down Induction of Decision Trees, main loop:
1. A ← the “best” decision attribute for the next node
2. Assign A as the decision attribute for the node
3. For each value of A, create a new descendant of the node
4. Sort the training examples to the leaf nodes
5. If the training examples are perfectly classified, then STOP; else iterate over the new leaf nodes
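The five steps above can be sketched as a recursive procedure. This is a sketch, not the lecture's code: the attribute-selection heuristic is passed in as a function, and the dict-based tree representation is my own choice.

```python
from collections import Counter

def tdidt(examples, attributes, best_attribute):
    """Top-down induction. examples: list of (features_dict, label) pairs."""
    labels = [label for _, label in examples]
    # Step 5 (STOP): the examples are perfectly classified (all one class).
    if len(set(labels)) == 1:
        return labels[0]
    if not attributes:
        return Counter(labels).most_common(1)[0][0]  # fall back to majority class
    # Steps 1-2: pick the "best" attribute A and assign it to this node.
    a = best_attribute(examples, attributes)
    tree = {a: {}}
    # Steps 3-4: one descendant per value of A; sort the examples to it.
    for v in {feats[a] for feats, _ in examples}:
        subset = [(f, lab) for f, lab in examples if f[a] == v]
        remaining = [x for x in attributes if x != a]
        # Step 5 (else-branch): iterate over the new leaf nodes.
        tree[a][v] = tdidt(subset, remaining, best_attribute)
    return tree
```

A toy run with a trivial selection heuristic (`pick_first` is illustrative only):

```python
pick_first = lambda ex, attrs: attrs[0]
data = [({"Humidity": "High"}, "No"), ({"Humidity": "Normal"}, "Yes")]
print(tdidt(data, ["Humidity"], pick_first))
# {'Humidity': {'High': 'No', 'Normal': 'Yes'}}
```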

Entropy
Entropy(S) is the expected number of bits needed to encode the class (+ or −) of a randomly
drawn member of S (under the optimal, shortest-length code). Entropy measures the degree of
uncertainty; a related uncertainty measure for a binary variable is the variance p(1 − p).
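For a binary class with positive proportion p, this is Entropy(S) = −p log₂ p − (1 − p) log₂ (1 − p). A quick sketch comparing it with the binary variance (function names are mine):

```python
import math

def entropy(p):
    """Expected bits to encode the class of a random member of S,
    where p is the proportion of positive examples."""
    if p in (0.0, 1.0):
        return 0.0  # one class only: no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_variance(p):
    return p * (1 - p)

# Both peak at p = 0.5 (maximal uncertainty) and vanish at p = 0 or 1.
print(entropy(0.5), binary_variance(0.5))  # 1.0 0.25
```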

Information Gain
Gain(S,A) is the expected reduction in entropy due to sorting on A.

The information gain is higher for the attribute Humidity, so that is the best attribute to split on.
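The standard formula is Gain(S, A) = Entropy(S) − Σ over values v of A of (|Sᵥ| / |S|) · Entropy(Sᵥ). A minimal sketch of this computation; the four-example mini-dataset below is illustrative, not the lecture's worked example:

```python
import math
from collections import Counter

def entropy_of(labels):
    """Entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain(examples, attribute):
    """Expected entropy reduction from sorting `examples` on `attribute`.
    examples: list of (features_dict, label) pairs."""
    labels = [label for _, label in examples]
    n = len(examples)
    # Weighted entropy of each subset S_v, one per attribute value v.
    remainder = 0.0
    for v in {feats[attribute] for feats, _ in examples}:
        sub = [label for feats, label in examples if feats[attribute] == v]
        remainder += (len(sub) / n) * entropy_of(sub)
    return entropy_of(labels) - remainder

data = [({"Humidity": "High",   "Wind": "Weak"},   "No"),
        ({"Humidity": "High",   "Wind": "Strong"}, "No"),
        ({"Humidity": "Normal", "Wind": "Weak"},   "Yes"),
        ({"Humidity": "Normal", "Wind": "Strong"}, "Yes")]
# Humidity separates the classes perfectly here, so its gain is maximal;
# Wind tells us nothing, so its gain is zero.
print(gain(data, "Humidity") > gain(data, "Wind"))  # True
```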

ID3 Algorithm
There is noise in the data; the model must not fit that noise, because a model that does will not generalize.
• Prefer short trees, and trees with high-information-gain attributes near the root.
• This bias is a preference for some hypotheses, rather than a restriction of the hypothesis space H.
• Occam’s razor: prefer the shortest hypothesis that fits the data.
  o Arguments in favor:
    ▪ There are fewer short hypotheses than long hypotheses
    ▪ A short hypothesis that fits the data is unlikely to be a coincidence
    ▪ A long hypothesis that fits the data might be a coincidence
  o Arguments opposed:
    ▪ There are many ways to define small sets of hypotheses
    ▪ E.g. all trees with a prime number of nodes that use attributes beginning with “Z”

Seller: kimgouweleeuw
