[BSc TN] Summary A Modern Introduction to Probability and Statistics
Course
Statistiek (WI3104TN)
Institution
Technische Universiteit Delft (TU Delft)
Book
A Modern Introduction to Probability and Statistics
Chapter 2: Outcomes, events and probability
§2.1 Sample spaces
Sample spaces are simply sets whose elements describe the outcomes of the experiment in which we are interested. For example, the sample space of a coin toss is
Ω = {H, T },
and the sample space of all months is
Ω = {Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec}.
The order in which n different objects can be placed is called a permutation of the
n objects. For n objects, there are n! possible permutations.
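A quick sketch of this count in Python's standard library (the three-object list is just an illustration):

```python
from itertools import permutations
from math import factorial

# All orderings (permutations) of three distinct objects
objects = ["a", "b", "c"]
orderings = list(permutations(objects))

print(len(orderings))  # 6 orderings
print(factorial(3))    # n! = 3! = 6
```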
§2.2 Events
Subsets of the sample space are called events. For example, the event of a month
being long (containing 31 days) is
L = {Jan, Mar, May, Jul, Aug, Oct, Dec}.
For two sets A and B:
• The intersection of two sets is denoted by A ∩ B, and occurs if both A and B
occur.
• The union of two sets is denoted by A ∪ B, which occurs if at least one of the
events A or B occurs.
• The event A^c = {ω ∈ Ω : ω ∉ A} is called the complement of A, and occurs if
and only if A does not occur.
• We call events A and B disjoint or mutually exclusive if A and B have no
outcomes in common; in set terminology: A ∩ B = ∅.
• The event A implies event B if the outcomes of A also lie in B. In set notation:
A ⊂ B.
De Morgan's laws. For any two events A and B we have
(A ∪ B)^c = A^c ∩ B^c and (A ∩ B)^c = A^c ∪ B^c.
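These laws can be checked directly on finite sets. A small Python sketch using the months example (the second event F is my own illustrative choice):

```python
# Omega is the sample space of months, L the "long months" event,
# F an arbitrary second event (here: the first quarter).
omega = {"Jan", "Feb", "Mar", "Apr", "May", "Jun",
         "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
L = {"Jan", "Mar", "May", "Jul", "Aug", "Oct", "Dec"}
F = {"Jan", "Feb", "Mar"}

complement = lambda A: omega - A  # A^c = Omega \ A

# (L ∪ F)^c = L^c ∩ F^c and (L ∩ F)^c = L^c ∪ F^c
print(complement(L | F) == complement(L) & complement(F))  # True
print(complement(L & F) == complement(L) | complement(F))  # True
```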
§2.3 Probability
A probability function P on a finite sample space Ω assigns to each event A in Ω a
number P(A) in [0, 1] such that P(Ω) = 1 and P(A ∪ B) = P(A) + P(B) if A and B
are disjoint. The number P(A) is called the probability that A occurs.
To compute probabilities of events A and B that are not disjoint, we can use
P(A) = P(A ∩ B) + P(A ∩ B^c) and P(A ∪ B) = P(B) + P(A ∩ B^c).
Combining these equations yields the probability of a union: for any two events A
and B we have
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
For computing probabilities of complements of events, since A ∪ A^c = Ω, we deduce
that
P(A^c) = 1 − P(A).
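On a finite sample space with equally likely outcomes these rules can be verified by counting. A Python sketch on the months example (the event F is my own illustrative choice):

```python
from fractions import Fraction

omega = {"Jan", "Feb", "Mar", "Apr", "May", "Jun",
         "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
L = {"Jan", "Mar", "May", "Jul", "Aug", "Oct", "Dec"}  # long months
F = {"Jan", "Feb", "Mar"}                              # first quarter

# Uniform probabilities: P(A) = |A| / |Omega|
P = lambda A: Fraction(len(A), len(omega))

# P(L ∪ F) = P(L) + P(F) - P(L ∩ F)
print(P(L | F) == P(L) + P(F) - P(L & F))  # True
# P(L^c) = 1 - P(L)
print(P(omega - L) == 1 - P(L))            # True
```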
Chapter 3: Conditional probability and independence
§3.1 Conditional probability
The conditional probability of A given C is given by
P(A|C) = P(A ∩ C) / P(C),
provided P(C) > 0.
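A small sketch with one roll of a fair die (the events A and C are my own example):

```python
from fractions import Fraction

omega = set(range(1, 7))  # one roll of a fair die
A = {2, 4, 6}             # "the roll is even"
C = {4, 5, 6}             # "the roll is at least four"

P = lambda E: Fraction(len(E), len(omega))  # uniform probabilities

# P(A | C) = P(A ∩ C) / P(C)
cond = P(A & C) / P(C)
print(cond)  # 2/3
```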
§3.2 The multiplication rule
The multiplication rule. For any events A and C:
P(A ∩ C) = P(A|C) · P(C).
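For instance, the rule gives the chance of drawing two aces in a row from a standard 52-card deck (a classic example, not from the summary itself):

```python
from fractions import Fraction

# C = "first card is an ace", A = "second card is an ace",
# drawing without replacement from a 52-card deck.
P_C = Fraction(4, 52)
P_A_given_C = Fraction(3, 51)  # 3 aces left among 51 cards

# Multiplication rule: P(A ∩ C) = P(A|C) · P(C)
P_both_aces = P_A_given_C * P_C
print(P_both_aces)  # 1/221
```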
§3.3 The law of total probability and Bayes’ rule
The law of total probability. Suppose C1 , C2 , . . . , Cm are disjoint events such that
C1 ∪ C2 ∪ · · · ∪ Cm = Ω. The probability of an arbitrary event A can be expressed
as
P(A) = P(A|C1 )P(C1 ) + P(A|C2 )P(C2 ) + · · · + P(A|Cm )P(Cm ).
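A minimal numeric sketch, assuming a hypothetical two-urn experiment (all numbers here are illustrative assumptions, not from the text):

```python
from fractions import Fraction

# C1 = "urn 1 chosen", C2 = "urn 2 chosen" (disjoint, together Omega),
# A = "a red ball is drawn". Illustrative made-up probabilities:
P_C = [Fraction(1, 2), Fraction(1, 2)]          # P(C1), P(C2)
P_A_given_C = [Fraction(3, 5), Fraction(1, 5)]  # P(A|C1), P(A|C2)

# Law of total probability: P(A) = sum_i P(A|Ci) · P(Ci)
P_A = sum(pa * pc for pa, pc in zip(P_A_given_C, P_C))
print(P_A)  # 2/5
```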
Bayes’ rule. Suppose the events C1 , C2 , . . . , Cm are disjoint and C1 ∪ C2 ∪ · · · ∪ Cm =
Ω. The conditional probability of Ci , given an arbitrary event A, can be expressed
as:
P(Ci|A) = P(A|Ci) · P(Ci) / P(A).
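A sketch with made-up diagnostic-test numbers (the 1%, 95% and 5% figures are illustrative assumptions, not from the text); the denominator P(A) comes from the law of total probability:

```python
from fractions import Fraction

# C1 = "has the condition", C2 = "does not" (disjoint, together Omega),
# A = "test is positive". Illustrative made-up probabilities:
P_C = [Fraction(1, 100), Fraction(99, 100)]
P_A_given_C = [Fraction(95, 100), Fraction(5, 100)]

# Denominator via the law of total probability
P_A = sum(pa * pc for pa, pc in zip(P_A_given_C, P_C))

# Bayes' rule: P(C1 | A) = P(A|C1) · P(C1) / P(A)
posterior = P_A_given_C[0] * P_C[0] / P_A
print(posterior)  # 19/118, i.e. roughly 0.16
```

Even with a fairly accurate test, the posterior stays small because the condition itself is rare.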
§3.4 Independence
An event A is called independent of B if
P(A|B) = P(A).
Independence. To show that A and B are independent it suffices to prove just one
of the following:
• P(A|B) = P(A),
• P(B|A) = P(B),
• P(A ∩ B) = P(A)P(B),
where A may be replaced by A^c and B replaced by B^c, or both. If one of these
statements holds, all of them are true. If two events are not independent, they are
called dependent.
Independence of two or more events. Events A1, A2, . . . , Am are called independent
if
P(A1 ∩ A2 ∩ · · · ∩ Am) = P(A1)P(A2) · · · P(Am),
and this statement also holds when any number of the events A1, . . . , Am are replaced
by their complements A1^c, . . . , Am^c throughout the formula.
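The product criterion is easy to check on a finite sample space. A sketch with two fair dice (the events are my own choice):

```python
from fractions import Fraction
from itertools import product

# Two fair dice: Omega is all 36 ordered pairs, uniform probabilities.
omega = set(product(range(1, 7), repeat=2))
P = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] % 2 == 0}  # first die even
B = {w for w in omega if w[1] == 6}      # second die shows 6

# Independence: P(A ∩ B) = P(A) · P(B)
print(P(A & B) == P(A) * P(B))  # True
```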
Chapter 4: Discrete random variables
§4.1 Random variables
Let Ω be a sample space. A discrete random variable is a function X : Ω → R
that takes on a finite number of values a1 , a2 , . . . , an or an infinite number of values
a1 , a2 , . . . .
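As an illustration (the choice of two dice is my own), a random variable is just a function on the sample space:

```python
from itertools import product

# Omega: all 36 outcomes of rolling two dice
omega = list(product(range(1, 7), repeat=2))

def X(outcome):
    """Random variable X: Omega -> R, the sum of the two dice."""
    return outcome[0] + outcome[1]

# X takes the eleven values 2, 3, ..., 12
print(sorted({X(w) for w in omega}))
```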
§4.2 The probability distribution of a discrete random variable
The probability mass function p of a discrete random variable X is the function
p : R → [0, 1], defined by
p(a) = P(X = a) for − ∞ < a < ∞.
The distribution function F of a random variable X is the function F : R → [0, 1],
defined by
F (a) = P(X ≤ a) for − ∞ < a < ∞.
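A sketch of both functions for one roll of a fair die (my own example):

```python
from fractions import Fraction

# X = outcome of one roll of a fair die
values = range(1, 7)
p = {a: Fraction(1, 6) for a in values}  # probability mass function p(a)

def F(a):
    """Distribution function F(a) = P(X <= a)."""
    return sum(prob for v, prob in p.items() if v <= a)

print(F(3))             # 1/2
print(sum(p.values()))  # the masses sum to 1
```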