Information and Data Science Project
Introduction/ Scope

This document serves as a placeholder for ITDS Assignments 1 to 5.

Content

1. Assignment 1 - - - - - - - - - 2

Entropy
Test Entropy
Joint Entropy
Conditional Entropy
Mutual Information
Normalized versions of entropy and mutual information

2. Assignment 2 - - - - - - - - - 6

Difference between entropy of Discrete RV
Differential Entropy
Differential Entropy of Gaussian and Estimated PDF

3. Assignment 3 - - - - - - - - - 10

PMF of Iris features
Entropy of the features of the Iris dataset
Mutual information between pairs of features

4. Assignment 4 - - - - - - - - - 12

Bayes classifier function: continuous random variable (RV)
Naïve Bayes classifier function for continuous and independent RVs
Naïve Bayes classifier function for continuous, independent, and Gaussian-distributed RVs
Computation and comparison of average accuracy

Annexes - - - - - - - - - 16

List of the scripts and functions

References




Chapter 1

Assignment 1


This assignment requests the following tasks:

1. A function called 'entropy' which computes the entropy of a discrete random
variable given its probability mass function [p1, p2, ..., pN].

2. A script called 'test_entropy2' which computes the entropy of a generic
binary random variable as a function of p0 and plots the entropy function.

3. A function called 'joint_entropy' which computes the joint entropy of two
generic discrete random variables given their joint p.m.f.

4. A function called 'conditional_entropy' which computes the conditional
entropy of two generic discrete random variables given their joint and
marginal p.m.f.

5. A function called 'mutual_information' which computes the mutual
information of two generic discrete random variables given their joint and
marginal p.m.f. (sketches of items 4 and 5 follow this list).

6. Functions for normalized versions of conditional entropy, joint entropy,
and mutual information for the discrete case.
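
For reference, the following is a minimal sketch of how the functions in items 4 and 5 might be implemented in Python, assuming the joint p.m.f. is passed as a 2-D NumPy array with rows indexed by x (illustrative only, not the document's own code):

import numpy as np

def conditional_entropy(pXY, pX):
    # H(Y|X) in bits, given the joint pmf p(x, y) and the marginal pmf p(x)
    pXY = np.asarray(pXY, dtype=float)
    pX = np.asarray(pX, dtype=float)
    h = 0.0
    for i in range(pXY.shape[0]):
        for j in range(pXY.shape[1]):
            if pXY[i, j] > 0 and pX[i] > 0:
                h -= pXY[i, j] * np.log2(pXY[i, j] / pX[i])
    return h

def mutual_information(pXY, pX, pY):
    # I(X;Y) in bits, given the joint pmf and both marginal pmfs
    pXY = np.asarray(pXY, dtype=float)
    mi = 0.0
    for i in range(pXY.shape[0]):
        for j in range(pXY.shape[1]):
            if pXY[i, j] > 0:
                mi += pXY[i, j] * np.log2(pXY[i, j] / (pX[i] * pY[j]))
    return mi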

Entropy
The function called 'entropy' computes the entropy of a discrete random variable
given its probability mass function (pmf). It accepts a probability vector (array)
and outputs a single value, the entropy.

The function implements the following equation:

$$H(X) = H_{N_X}(p_1, p_2, \ldots, p_{N_X}) \triangleq \sum_{k=1}^{N_X} p_{x_k} \log \frac{1}{p_{x_k}} = -\sum_{k=1}^{N_X} p_{x_k} \log p_{x_k}$$

where
$p_{x_k}$ = probability of a particular outcome
$H(X)$ = the entropy of the univariate pmf

Running the script

fX = [0.2, 0.3, 0.1, 0.1, 0.1, 0.2]
Hx = entropy.entropy(fX)
print('The entropy of the given pmf fX = [0.2, 0.3, 0.1, 0.1, 0.1, 0.2] is :', Hx)

>>> output
The entropy of the given pmf fX = [0.2, 0.3, 0.1, 0.1, 0.1, 0.2] is : 2.4464393446710155
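
A minimal sketch of what such an 'entropy' function might look like, assuming base-2 logarithms and a NumPy array input (the call entropy.entropy(fX) above suggests a module named entropy; the body below is illustrative, not the document's actual code):

import numpy as np

def entropy(pmf):
    # Entropy (bits) of a discrete RV given its pmf [p1, p2, ..., pN]
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]                     # treat 0 * log 0 as 0
    return float(-np.sum(p * np.log2(p)))

With fX = [0.2, 0.3, 0.1, 0.1, 0.1, 0.2] this reproduces the value of about 2.4464 bits shown above.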

Test entropy2


The 'test_entropy2' script computes the entropy of a generic binary random
variable as a function of p0 and plots the entropy function. The script requests
a single value of p in the range 0 < p < 1 and yields the corresponding entropy
as output.

It also plots the entropy over all possible distributions in the range 0 < p < 1.

For the random variable X, with X = 0 (HEAD) and X = 1 (TAIL):

$P(X=0) = p, \quad P(X=1) = q$

Since the probabilities sum to one, $\sum_{i=1}^{N} p_i = 1$, it follows that $q = 1 - p$.

We can compute the entropy as

$$H(X) = p \log_2 \frac{1}{p} + (1-p) \log_2 \frac{1}{1-p} \quad \text{[bits]}$$
p ( 1− p )

Running the script

What is the value of your random probability? P0 = 0.3

>>> output

0.8812908992306926
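
A minimal sketch of what the 'test_entropy2' script could look like, assuming matplotlib is used for the plot (illustrative only, not the document's actual code):

import numpy as np
import matplotlib.pyplot as plt

def binary_entropy(p):
    # Entropy (bits) of a binary RV with P(X=0) = p, P(X=1) = 1 - p
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * np.log2(1.0 / p) + (1.0 - p) * np.log2(1.0 / (1.0 - p))

p0 = float(input('What is the value of your random probability? P0 = '))
print(binary_entropy(p0))

# Plot H(p) over all possible distributions in the range 0 < p < 1
ps = np.linspace(0.001, 0.999, 999)
plt.plot(ps, [binary_entropy(p) for p in ps])
plt.xlabel('p0')
plt.ylabel('H(p0) [bits]')
plt.title('Entropy of a binary random variable')
plt.show()

For p0 = 0.3 this gives approximately 0.8813 bits, matching the output above.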




Joint entropy
The function called 'joint_entropy' computes the joint entropy of two generic
discrete random variables given their joint p.m.f.

The function follows the equation below (stated here in continuous form; for discrete variables the integrals become sums over the joint p.m.f.):

$$h(X,Y) = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f_{XY}(x,y) \log_2 \frac{1}{f_{XY}(x,y)} \, dx \, dy$$
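
A minimal sketch of how such a 'joint_entropy' function might be written for the discrete case, assuming the joint p.m.f. is supplied as a 2-D NumPy array (illustrative only, not the document's actual code):

import numpy as np

def joint_entropy(pXY):
    # Joint entropy (bits) of two discrete RVs given their joint pmf as a 2-D array
    p = np.asarray(pXY, dtype=float).ravel()
    p = p[p > 0]                     # treat 0 * log 0 as 0
    return float(-np.sum(p * np.log2(p)))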


