Probability Theory
Freeke Boerrigter
Chapter 1 – Probability and Counting
Naïve definition of probability: P_naïve(A) = |A| / |S|
A – the event that A occurs; |A| is the number of outcomes in A
S – the sample space of all possible outcomes, assumed finite with all outcomes equally likely; |S| is the number of outcomes in S
A^c – the complement of A, the event that A does not occur
P_naïve(A^c) = 1 – P_naïve(A)
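As a quick illustration (my own example, not from the summary), the naïve definition can be checked by direct counting, here for the event of rolling an even number with a fair die:

```python
from fractions import Fraction

# Sample space for one roll of a fair die and the event "roll is even".
S = {1, 2, 3, 4, 5, 6}
A = {x for x in S if x % 2 == 0}

# Naive definition: P(A) = |A| / |S| (finite sample space, equally likely outcomes).
p_A = Fraction(len(A), len(S))
p_A_complement = 1 - p_A          # complement rule: P(A^c) = 1 - P(A)

print(p_A, p_A_complement)        # 1/2 1/2
```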
The binomial coefficient formula: (n choose k) = n! / ((n − k)! k!), the number of ways to pick k objects out of a total set of n, where order does not matter.
Choosing the complement: for any nonnegative integers n and k with k ≤ n we have (n choose k) = (n choose n − k).
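A small sketch (using Python's standard library, with arbitrarily chosen n and k) showing the formula and the complement identity:

```python
from math import comb, factorial

n, k = 10, 3

# Binomial coefficient: n! / ((n - k)! k!)
by_formula = factorial(n) // (factorial(n - k) * factorial(k))

print(by_formula, comb(n, k))        # 120 120
print(comb(n, k) == comb(n, n - k))  # True: choosing k is the same as leaving out n - k
```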
Non-naïve definition of probability – a probability space consists of a sample
space S and a probability function P which takes an event A ⊆ S as input and
returns P(A), a real number between 0 and 1, as output. The function P must
satisfy the following axioms:
1. P(∅) = 0 and P(S) = 1
2. P(⋃_{j=1}^∞ A_j) = Σ_{j=1}^∞ P(A_j) for any sequence of disjoint events A_1, A_2, … -> the probability of a union of disjoint events is the sum of their individual probabilities
Multiplication Rule -> if A and B are independent (knowing that one occurred does not change the probability of the other), then P(A ∩ B) = P(A)P(B)
Properties of probability:
P(A^c) = 1 – P(A)
If A ⊆ B, then P(A) ≤ P(B)
P(A ∪ B) = P(A) + P(B) – P(A ∩ B) (inclusion-exclusion)
P(S) = P(A) + P(A^c) = 1
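These properties can be sanity-checked with naïve probabilities on a small sample space (an illustrative sketch; the die example is my own, not from the summary):

```python
from fractions import Fraction

S = set(range(1, 7))                      # one roll of a fair die
A = {2, 4, 6}                             # even roll
B = {4, 5, 6}                             # roll of at least 4

def p(event):
    # Naive probability on the finite sample space S.
    return Fraction(len(event & S), len(S))

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
print(p(A | B) == p(A) + p(B) - p(A & B))   # True
# Complement rule and P(S) = 1
print(p(S - A) == 1 - p(A), p(S) == 1)      # True True
```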
Chapter 2 – Conditional Probability
Two events are dependent if knowing that one of them occurred changes the probability of the other.
The Conditional Probability of A given B is P(A|B) = P(A ∩ B) / P(B), provided that P(B) > 0.
We call P(A) the prior probability of A and P(A|B) the posterior probability
of A
Bayes’ Rule: P(A|B) = P(B|A) P(A) / P(B)
The Law of Total Probability (LOTP): P(B) = Σ_{i=1}^n P(B|A_i) P(A_i), where A_1, …, A_n partition the sample space S.
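A worked illustration of Bayes’ Rule combined with LOTP, in the style of a testing problem (the numbers below are my own assumptions, not from the summary):

```python
# Hypothetical numbers: A = "condition present", B = "test is positive".
p_A = 0.01           # prior P(A)
p_B_given_A = 0.95   # P(B | A)
p_B_given_Ac = 0.10  # P(B | A^c)

# LOTP: P(B) = P(B|A) P(A) + P(B|A^c) P(A^c), since A and A^c partition S.
p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)

# Bayes' rule: posterior P(A | B) = P(B | A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_B, 4), round(p_A_given_B, 4))   # 0.1085 0.0876
```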
Simpson’s paradox occurs when each group of data shows a particular trend, but the trend reverses when the groups are combined.
Example: on Saturday you get 7/8 points (87.5%) and your friend gets 2/2 points (100%). On Sunday you get 1/2 points (50%) and your friend gets 5/8 points (62.5%). On both days your friend has the higher proportion of points; however, when you combine the days you have 8/10 points (80%) and your friend has 7/10 points (70%).
This is the paradox: combining the groups of data reverses the trend.
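The arithmetic of the example can be verified directly (a quick check using the same numbers as above):

```python
you    = [(7, 8), (1, 2)]   # (points, attempts) on Saturday and Sunday
friend = [(2, 2), (5, 8)]

# Per day your friend's proportion is higher...
for (y, yn), (f, fn) in zip(you, friend):
    print(y / yn < f / fn)                     # True, True

# ...but after combining the days the trend reverses.
y_tot = sum(p for p, _ in you) / sum(n for _, n in you)
f_tot = sum(p for p, _ in friend) / sum(n for _, n in friend)
print(y_tot, f_tot)                            # 0.8 0.7
```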
Chapter 3 – Random Variables and their Distributions
Probability Mass Function (PMF) is a function that gives the probability that
a discrete random variable is exactly equal to some value.
Bernoulli Distribution has only two outcomes, success or failure. Example: tossing a fair coin, where the chance of getting heads is p = 0.5 and the chance of getting tails is 1 – p = 1 – 0.5 = 0.5.
A random variable X is Bernoulli with parameter p if P(X = 1) = p and P(X = 0) = 1 – p, where 0 < p < 1.
X ~ Bern(p)
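A minimal sketch of a Bernoulli random variable by simulation (illustrative only; p = 0.5 is an assumed value):

```python
import random

def bernoulli(p):
    # Returns 1 (success) with probability p and 0 (failure) with probability 1 - p.
    return 1 if random.random() < p else 0

p = 0.5
samples = [bernoulli(p) for _ in range(10_000)]
print(sum(samples) / len(samples))   # close to p = 0.5
```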
Binomial Distribution counts the number of successes when a Bernoulli trial is repeated multiple independent times; each trial has two possible outcomes.
Let X be the number of successes, and let n and p be the parameters, where n is a positive integer (the number of trials) and 0 < p < 1 (the success probability per trial).
X ~ Bin(n, p)
The PMF of X if X ~ Bin(n, p) is P(X = k) = (n choose k) p^k (1 − p)^(n−k) for k = 0, 1, …, n.
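The PMF can be computed directly from this formula (a sketch with arbitrarily chosen n and p):

```python
from math import comb

def binom_pmf(k, n, p):
    # P(X = k) = (n choose k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
print(round(sum(pmf), 10))   # 1.0 -> the PMF sums to 1 over k = 0, ..., n
```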
Hypergeometric Distribution is used when you want the probability of obtaining a certain number of successes when drawing a sample of fixed size without replacement.
X ~ HGeom(w, b, n) -> w and b stand for the numbers of white and black balls in the urn, and n is the number of balls drawn without replacement.
The PMF of X if X ~ HGeom(w, b, n) is P(X = k) = (w choose k)(b choose n − k) / ((w + b) choose n) for integers k satisfying 0 ≤ k ≤ w and 0 ≤ n – k ≤ b, and P(X = k) = 0 otherwise.
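The same idea works for the hypergeometric PMF (illustrative values for w, b, n chosen by me):

```python
from math import comb

def hgeom_pmf(k, w, b, n):
    # P(X = k) = (w choose k)(b choose n - k) / ((w + b) choose n), and 0 otherwise.
    if 0 <= k <= w and 0 <= n - k <= b:
        return comb(w, k) * comb(b, n - k) / comb(w + b, n)
    return 0.0

w, b, n = 6, 4, 5            # 6 white balls, 4 black balls, draw 5 without replacement
pmf = [hgeom_pmf(k, w, b, n) for k in range(n + 1)]
print(round(sum(pmf), 10))   # 1.0
```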
Discrete Uniform Distribution says that all outcomes in a finite, nonempty set of numbers C are equally likely.
X ~ DUnif(C) if X is the number chosen uniformly at random from C.
The PMF of X if X ~ DUnif(C) is P(X = x) = 1 / |C| for x ∈ C (and 0 otherwise).
Cumulative Distribution Function (CDF) is the function F(x) = P(X ≤ x), which gives the probability that a random variable X is less than or equal to some value x; it is defined for any random variable, discrete or continuous.
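For a discrete random variable the CDF can be built up from the PMF by summing; a sketch using the discrete uniform distribution from above (the set C is my own choice):

```python
from fractions import Fraction

C = {1, 2, 3, 4, 5, 6}                       # X ~ DUnif(C)

def pmf(x):
    # P(X = x) = 1 / |C| for x in C, and 0 otherwise.
    return Fraction(1, len(C)) if x in C else Fraction(0)

def cdf(x):
    # F(x) = P(X <= x), the sum of the PMF over all values up to x.
    return sum(pmf(c) for c in C if c <= x)

print(cdf(3), cdf(6), cdf(0))                # 1/2 1 0
```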
Any function of a random variable is also a random variable itself.