CS 189 Introduction to Machine Learning
Spring 2019 Jonathan Shewchuk HW3
Due: Wednesday, February 27 at 11:59 pm
Deliverables:


1. Submit your predictions for the test sets to Kaggle as early as possible. Include your Kaggle
scores in your write-up (see below). The Kaggle competition for this assignment can be
found at:

• https://www.kaggle.com/c/cs189-hw3-mnist
• https://www.kaggle.com/c/cs189-hw3-spam
2. Submit a PDF of your homework, with an appendix listing all your code, to the Gradescope
assignment entitled “HW3 Write-Up”. You may typeset your homework in LaTeX or
Word (submit PDF format, not .doc/.docx format) or submit neatly handwritten and scanned
solutions. Please start each question on a new page. If there are graphs, include those
graphs in the correct sections. Do not put them in an appendix. We need each solution to be
self-contained on pages of its own.
• In your write-up, please state with whom you worked on the homework.
• In your write-up, please copy the following statement and sign your signature next to it.
(Mac Preview and Foxit PDF Reader, among others, have tools to let you sign a PDF
file.) We want to make it extra clear so that no one inadvertently cheats. “I certify
that all solutions are entirely in my own words and that I have not looked at another
student’s solutions. I have given credit to all external sources I consulted.”
3. Submit all the code needed to reproduce your results to the Gradescope assignment entitled
“HW3 Code”. Yes, you must submit your code twice: once in your PDF write-up (above) so
the readers can easily read it, and once in compilable/interpretable form so the readers can
easily run it. Do NOT include any data files we provided. Please include a short file named
README listing your name, student ID, and instructions on how to reproduce your results.
Please take care that your code doesn’t take up inordinate amounts of time or memory. If
your code cannot be executed, your solution cannot be verified.

4. The assignment covers concepts on Gaussian distributions and classifiers. Some of the
material may not have been covered in lecture; you are responsible for finding resources to
understand it.




HW3, ©UCB CS 189, Spring 2019. All Rights Reserved. This may not be publicly shared without explicit permission. 1

2 Gaussian Classification
Let P(x | Ci) ∼ N(µi, σ²) for a two-category, one-dimensional classification problem with classes
C1 and C2, P(C1) = P(C2) = 1/2, and µ2 > µ1.

(a) Find the Bayes optimal decision boundary and the corresponding Bayes decision rule.
(b) The Bayes error is the probability of misclassification,

Pe = P(misclassified as C1 | C2) P(C2) + P(misclassified as C2 | C1) P(C1).

Show that the Bayes error associated with this decision rule is

Pe = (1/√(2π)) ∫_a^∞ e^(−z²/2) dz,    where a = (µ2 − µ1)/(2σ).


Solution:


(a) The decision boundary is where the posterior probabilities are equal. By Bayes’ rule,

P(C1 | x) = P(C2 | x) ⇔ P(x | C1) P(C1)/P(x) = P(x | C2) P(C2)/P(x)

Since P(C1) = P(C2) = 1/2, this is equivalent to

P(x | C1) = P(x | C2) ⇔ N(µ1, σ²) = N(µ2, σ²) ⇔ (x − µ1)² = (x − µ2)²

This yields the Bayes decision boundary: x = (µ1 + µ2)/2.
The corresponding decision rule is, given a data point x ∈ R:
• if x < (µ1 + µ2)/2, then classify x in class 1
• otherwise, classify x in class 2
Note that this is the centroid method.
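As a quick numerical sanity check (not part of the assignment), scipy confirms that the two class-conditional densities intersect exactly at the midpoint; the parameters µ1 = 0, µ2 = 2, σ = 1 below are arbitrary illustrative choices:

```python
from scipy.stats import norm

# Illustrative (hypothetical) parameters, with mu2 > mu1 as in the problem
mu1, mu2, sigma = 0.0, 2.0, 1.0
boundary = (mu1 + mu2) / 2  # Bayes decision boundary derived above

# The two class-conditional densities agree at the midpoint
p1 = norm.pdf(boundary, loc=mu1, scale=sigma)
p2 = norm.pdf(boundary, loc=mu2, scale=sigma)
assert abs(p1 - p2) < 1e-12

# Left of the boundary, class 1 has the larger density, matching the rule
x = boundary - 0.5
assert norm.pdf(x, loc=mu1, scale=sigma) > norm.pdf(x, loc=mu2, scale=sigma)
```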
(b) Substituting z = (x − µ2)/σ, and writing a = (µ2 − µ1)/(2σ):

P(misclassified as C1 | C2) = ∫_{−∞}^{(µ1+µ2)/2} (1/(√(2π)σ)) e^(−(x−µ2)²/(2σ²)) dx
                            = ∫_{−∞}^{−a} (1/√(2π)) e^(−z²/2) dz
                            = ∫_{a}^{+∞} (1/√(2π)) e^(−z²/2) dz    (by symmetry of the integrand)
                            = Pe



Similarly, with z = (x − µ1)/σ:

P(misclassified as C2 | C1) = ∫_{(µ1+µ2)/2}^{+∞} (1/(√(2π)σ)) e^(−(x−µ1)²/(2σ²)) dx
                            = ∫_{a}^{+∞} (1/√(2π)) e^(−z²/2) dz
                            = Pe

Therefore:

P(misclassified as C1 | C2) P(C2) + P(misclassified as C2 | C1) P(C1) = Pe · 1/2 + Pe · 1/2 = Pe
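The closed form can also be checked by simulation (illustrative only; the parameters below are arbitrary). Drawing equally from both classes and applying the midpoint rule should reproduce Pe = 1 − Φ(a):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu1, mu2, sigma = 0.0, 2.0, 1.0          # hypothetical parameters
boundary = (mu1 + mu2) / 2
a = (mu2 - mu1) / (2 * sigma)

n = 1_000_000
x1 = rng.normal(mu1, sigma, n)           # samples from class 1
x2 = rng.normal(mu2, sigma, n)           # samples from class 2

# Midpoint rule: x < boundary -> class 1, else class 2
err = 0.5 * np.mean(x1 >= boundary) + 0.5 * np.mean(x2 < boundary)

pe = norm.sf(a)                          # closed-form Bayes error, 1 - Phi(a)
assert abs(err - pe) < 5e-3
```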


3 Isocontours of Normal Distributions
Let f(µ, Σ) be the probability density function of a normally distributed random variable in R².
Write code to plot the isocontours of the following functions, each on its own separate figure.
You’re free to use any plotting libraries available in your programming language; for instance, in
Python you can use Matplotlib.
   
1 1 0
(a) f (µ, Σ), where µ =   and Σ =  .
1 0 2
   
−1 2 1
(b) f (µ, Σ), where µ =   and Σ =  .
2 1 3
     
0 2 2 1
(c) f (µ1 , Σ1 ) − f (µ2 , Σ2 ), where µ1 =  , µ2 =   and Σ1 = Σ2 =  .
2 0 1 1
       
0 2 2 1 2 1
(d) f (µ1 , Σ1 ) − f (µ2 , Σ2 ), where µ1 =  , µ2 =  , Σ1 =   and Σ2 =  .
2 0 1 1 1 3
       
1 −1 2 0 2 1
(e) f (µ1 , Σ1 ) − f (µ2 , Σ2 ), where µ1 =  , µ2 =  , Σ1 =   and Σ2 =  .
1 −1 0 1 1 2

Solution:
import matplotlib.pyplot as plt
import numpy as np
import scipy.stats

def plot_contours(rv, pos):
    # Draw the isocontours of rv's pdf evaluated on a 500x500 grid
    fig = plt.figure(figsize=(10, 10))
    ax0 = fig.add_subplot(111)
    ax0.contour(rv.pdf(pos).reshape(500, 500))
    plt.show()

# Part a

# Generate grid of points at which to evaluate pdf (range chosen to cover the means)
x, y = np.meshgrid(np.linspace(-5, 5, 500), np.linspace(-5, 5, 500))
pos = np.dstack((x, y))

rv = scipy.stats.multivariate_normal(mean=[1, 1], cov=[[1, 0], [0, 2]])
plot_contours(rv, pos)

