STAB52H: Introduction to Probability
Fall, 2014
Instructor: Jabed Tomal
Department of Computer and Mathematical Sciences
University of Toronto Scarborough
Toronto, ON
Canada
October 8, 2014
Joint Distributions:
Definition 2.7.1 If X and Y are random variables, then the joint distribution of X and Y is the collection of probabilities P((X, Y) ∈ B), for all subsets B ⊆ R^2 of pairs of real numbers.
Joint Distributions:
Definition 2.7.2 Let X and Y be random variables. Then their joint cumulative distribution function is the function F_{X,Y} : R^2 → [0, 1] defined by
    F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y).
The comma stands for "and" here, so that F_{X,Y}(x, y) is the probability that X ≤ x and Y ≤ y.
Joint Distributions:
Exercise 2.7.1 Let X ∼ Bernoulli(2/3), and let Y = 4X − 2. Compute the joint cdf F_{X,Y}.
The probability function of X is
    P_X(x) = 2/3 if x = 1,
             1/3 if x = 0.
The probability function of Y is
    P_Y(y) = 2/3 if y = 2,
             1/3 if y = −2.
Joint Distributions:
Exercise 2.7.1 (continued) Let X ∼ Bernoulli(2/3), and let Y = 4X − 2. Compute the joint cdf F_{X,Y}.
Hence, the joint cumulative distribution function of X and Y is
    F_{X,Y}(x, y) = 0    if min{x, (y + 2)/4} < 0,
                    1/3  if 0 ≤ min{x, (y + 2)/4} < 1,
                    1    if min{x, (y + 2)/4} ≥ 1.
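A quick numerical cross-check of this piecewise formula is possible by brute force over the two possible values of X. The following is a minimal Python sketch (the grid of test points is an arbitrary choice):

# Sketch: brute-force check of the joint cdf in Exercise 2.7.1,
# where Y = 4X - 2 and X ~ Bernoulli(2/3).

def F_joint(x, y):
    # Sum P(X = k) over the values k of X with k <= x and 4k - 2 <= y.
    pmf = {1: 2/3, 0: 1/3}
    return sum(p for k, p in pmf.items() if k <= x and 4*k - 2 <= y)

def F_formula(x, y):
    # The piecewise formula above, written in terms of m = min{x, (y + 2)/4}.
    m = min(x, (y + 2)/4)
    return 0.0 if m < 0 else (1/3 if m < 1 else 1.0)

# Compare the two on a small grid of (x, y) points.
for x in (-1.0, 0.0, 0.5, 1.0, 2.0):
    for y in (-3.0, -2.0, 0.0, 2.0, 5.0):
        assert abs(F_joint(x, y) - F_formula(x, y)) < 1e-12
print("piecewise formula matches the brute-force cdf")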
Joint Distributions:
Theorem 2.7.1 Let X and Y be any random variables, with joint cumulative distribution function F_{X,Y}. Let B be a subset of R^2. Then P((X, Y) ∈ B) can be determined solely from the values of F_{X,Y}(x, y).
Joint Distributions:
Theorem 2.7.2 Let X and Y be any random variables, with joint cumulative distribution function F_{X,Y}. Suppose a ≤ b and c ≤ d. Then
    P(a < X ≤ b, c < Y ≤ d) = F_{X,Y}(b, d) − F_{X,Y}(a, d) − F_{X,Y}(b, c) + F_{X,Y}(a, c).
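This identity holds for any joint cdf, so it can be sanity-checked numerically against a Monte Carlo estimate. Below is a Python sketch using an arbitrarily chosen bivariate normal; a reasonably recent SciPy is assumed for multivariate_normal.cdf:

# Sketch: check P(a < X <= b, c < Y <= d) = F(b,d) - F(a,d) - F(b,c) + F(a,c).
import numpy as np
from scipy.stats import multivariate_normal

dist = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])
a, b, c, d = -0.5, 1.0, -1.0, 0.8

F = lambda x, y: dist.cdf([x, y])
rect = F(b, d) - F(a, d) - F(b, c) + F(a, c)

samples = dist.rvs(size=200_000, random_state=np.random.default_rng(0))
mc = np.mean((samples[:, 0] > a) & (samples[:, 0] <= b) &
             (samples[:, 1] > c) & (samples[:, 1] <= d))
print(rect, mc)   # the two numbers should agree to about two decimal places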
Joint Distributions:
Proof Let A and B be two events with B ⊆ A. Then
    P(A ∩ B^c) = P(A) − P(B).
Hence, we can write
    P(a < X ≤ b, c < Y ≤ d) = P(X ≤ b, Y ≤ d) − P{(X ≤ a, Y ≤ d) ∪ (X ≤ b, Y ≤ c)}.
Joint Distributions:
Proof (continued) Let A and B be two events. Then, using the principle of inclusion-exclusion, we get
    P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Hence, we write
    P{(X ≤ a, Y ≤ d) ∪ (X ≤ b, Y ≤ c)} = P(X ≤ a, Y ≤ d) + P(X ≤ b, Y ≤ c) − P{(X ≤ a, Y ≤ d) ∩ (X ≤ b, Y ≤ c)}.
Since the intersection on the right is the event {X ≤ a, Y ≤ c}, every probability above is a value of F_{X,Y}. We complete the proof by putting the results together.
Marginal Distributions:
Theorem 2.7.3 Let X and Y be two random variables, with joint cumulative distribution function F_{X,Y}. Then the marginal cumulative distribution function F_X of X satisfies
    F_X(x) = lim_{y→∞} F_{X,Y}(x, y),   for all x ∈ R^1.
Similarly, the marginal cumulative distribution function F_Y of Y satisfies
    F_Y(y) = lim_{x→∞} F_{X,Y}(x, y),   for all y ∈ R^1.
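In practice, letting the other argument grow large simply removes the corresponding constraint. A small numerical illustration in Python (parameter choices arbitrary, recent SciPy assumed):

# Sketch: F_X(x) is recovered from F_{X,Y}(x, y) by letting y grow large.
from scipy.stats import multivariate_normal, norm

dist = multivariate_normal(mean=[1.0, -2.0], cov=[[4.0, 1.2], [1.2, 9.0]])
x = 0.3
for y in (0.0, 10.0, 100.0):
    print(y, dist.cdf([x, y]))          # increases toward the marginal cdf of X
print(norm(loc=1.0, scale=2.0).cdf(x))  # marginal of X is N(1, 4)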
Marginal Distributions:
Proof Note that we always have Y ≤ ∞. Hence, using continuity of P, we have
    F_X(x) = P(X ≤ x)
           = P(X ≤ x, Y ≤ ∞)
           = lim_{y→∞} P(X ≤ x, Y ≤ y)
           = lim_{y→∞} F_{X,Y}(x, y),
as claimed.
Joint Probability Functions:
Definition 2.7.3 Let X and Y be discrete random variables. Then their joint probability function, p_{X,Y}, is a function from R^2 to [0, 1], defined by
    p_{X,Y}(x, y) = P(X = x, Y = y).
Joint Probability Functions:
Example 2.7.4 Let X ∼ Bernoulli(1/2), Y1 = X, and Y2 = 1 − X. Then
    p_X(x) = 1/2 if x = 1,
             1/2 if x = 0.
Then the joint probability function of X and Y1 is
    p_{X,Y1}(x, y) = P(X = x, Y1 = y) = 1/2 if x = y = 1,
                                        1/2 if x = y = 0,
                                        0   otherwise.
Joint Probability Functions:
Example 2.7.4 (continued) Let X ∼ Bernoulli(1/2), Y1 = X, and Y2 = 1 − X. Then the joint probability function of X and Y2 is
    p_{X,Y2}(x, y) = P(X = x, Y2 = y) = 1/2 if x = 1, y = 0,
                                        1/2 if x = 0, y = 1,
                                        0   otherwise.
Joint Probability Functions:
Theorem 2.7.4 Let X and Y be two discrete random variables, with joint probability function p_{X,Y}. Then the probability function p_X of X can be computed as
    p_X(x) = Σ_y p_{X,Y}(x, y).
Similarly, the probability function p_Y of Y can be computed as
    p_Y(y) = Σ_x p_{X,Y}(x, y).
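When the joint probability function is stored as a table of (x, y) pairs, these marginal sums are one loop. A minimal Python sketch, using the joint probability function p_{X,Y1} from Example 2.7.4:

# Sketch: marginal probability functions obtained by summing the joint pmf.
from collections import defaultdict

# Joint pmf of (X, Y1) from Example 2.7.4: mass 1/2 at (1, 1) and at (0, 0).
joint = {(1, 1): 0.5, (0, 0): 0.5}

p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    p_X[x] += p   # sum over y for fixed x
    p_Y[y] += p   # sum over x for fixed y

print(dict(p_X))  # {1: 0.5, 0: 0.5}, the Bernoulli(1/2) pmf of X
print(dict(p_Y))  # {1: 0.5, 0: 0.5}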
Joint Probability Functions:
Example 2.7.5 Suppose the joint probability function of X and Y is given by
    p_{X,Y}(x, y) = 1/7 if x = 5, y = 0,
                    1/7 if x = 5, y = 3,
                    1/7 if x = 5, y = 4,
                    3/7 if x = 8, y = 0,
                    1/7 if x = 8, y = 4,
                    0   otherwise.
Joint Probability Functions:
Example 2.7.5 (continued) The joint probability function of X and Y, and the marginal probability functions of X and of Y, can be displayed in a table:

              Y = 0    Y = 3    Y = 4    p_X(x)
    X = 5      1/7      1/7      1/7      3/7
    X = 8      3/7       0       1/7      4/7
    p_Y(y)     4/7      1/7      2/7
Joint Density Functions:
Definition 2.7.4 Let f : R^2 → R^1 be a function. Then f is a joint density function if f(x, y) ≥ 0 for all x and y, and ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
Joint Density Functions:
Definition 2.7.5 Let X and Y be random variables. Then X and Y are jointly absolutely continuous if there is a density function f such that
    P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy,
for all a ≤ b, c ≤ d.
Joint Density Functions:
Example 2.7.6 Let X and Y be jointly absolutely continuous, with joint density function f given by
    f(x, y) = 4x^2 y + 2y^5 if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
              0             otherwise.
Verify that f is a density function.
Compute P(0.5 ≤ X ≤ 0.7, 0.2 ≤ Y ≤ 0.9).
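Both computations can be checked numerically with an iterated integral. A sketch using scipy.integrate.dblquad, which expects the integrand as f(y, x) with the outer integral taken over x:

# Sketch: verify the normalization and compute the rectangle probability.
from scipy.integrate import dblquad

f = lambda y, x: 4*x**2*y + 2*y**5      # integrand in dblquad's (y, x) order

total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)   # x in [0,1], y in [0,1]
print(total)   # should be 1.0, so f is a valid density

prob, _ = dblquad(f, 0.5, 0.7, lambda x: 0.2, lambda x: 0.9)
print(prob)    # P(0.5 <= X <= 0.7, 0.2 <= Y <= 0.9)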
Joint Density Functions:
Exercise 2.7.4 For the following joint density function f_{X,Y}, find the value of C and compute f_X(x), f_Y(y), and P(X ≤ 0.8, Y ≤ 0.6).
    f_{X,Y}(x, y) = 2x^2 y + C y^5 if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
                    0              otherwise.
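One way to check a hand-derived value of C is to impose the normalization condition symbolically. A SymPy sketch of that single step (the rest of the exercise is left as stated):

# Sketch: solve for the C that makes f_{X,Y} integrate to 1 over the unit square.
import sympy as sp

x, y, C = sp.symbols('x y C')
f = 2*x**2*y + C*y**5

total = sp.integrate(f, (x, 0, 1), (y, 0, 1))   # integral over the unit square
print(sp.solve(sp.Eq(total, 1), C))             # value of C making f a density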
Joint Density Functions:
Exercise 2.7.9 Let X and Y have density function
    f_{X,Y}(x, y) = (x + y)/4 if 0 < x < y < 2,
                    0         otherwise.
Compute each of the following:
    The marginal density f_X(x) for all x ∈ R^1,
    The marginal density f_Y(y) for all y ∈ R^1,
    P(Y < 1).
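The only subtlety here is that the support is the triangle 0 < x < y < 2, so the integration limits for one variable depend on the other. A SymPy sketch of how the hand calculations can be checked:

# Sketch: marginals of f_{X,Y}(x, y) = (x + y)/4 on the triangle 0 < x < y < 2.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = (x + y) / 4

f_X = sp.integrate(f, (y, x, 2))              # valid for 0 < x < 2, zero elsewhere
f_Y = sp.integrate(f, (x, 0, y))              # valid for 0 < y < 2, zero elsewhere
p   = sp.integrate(f, (x, 0, y), (y, 0, 1))   # P(Y < 1)

print(sp.simplify(f_X), sp.simplify(f_Y), p)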
Bivariate Normal Distribution:
Example 2.7.9 Let µ1, µ2, σ1, σ2, and ρ be real numbers, with σ1, σ2 > 0 and −1 < ρ < 1. Let X and Y have density function given by
    f_{X,Y}(x, y) = 1 / (2π σ1 σ2 √(1 − ρ^2)) ×
        exp{ −1/(2(1 − ρ^2)) [ ((x − µ1)/σ1)^2 + ((y − µ2)/σ2)^2 − 2ρ ((x − µ1)/σ1)((y − µ2)/σ2) ] }
for x ∈ R^1, y ∈ R^1. We say that X and Y have the Bivariate Normal(µ1, µ2, σ1, σ2, ρ) distribution.
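This density is the same one scipy.stats.multivariate_normal uses with mean (µ1, µ2) and covariance matrix [[σ1^2, ρσ1σ2], [ρσ1σ2, σ2^2]]. A sketch comparing the formula with the library at a few points (parameter values arbitrary):

# Sketch: the bivariate normal density above vs. SciPy's multivariate_normal pdf.
import numpy as np
from scipy.stats import multivariate_normal

m1, m2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6

def f(x, y):
    z = (((x - m1)/s1)**2 + ((y - m2)/s2)**2
         - 2*rho*((x - m1)/s1)*((y - m2)/s2))
    return np.exp(-z / (2*(1 - rho**2))) / (2*np.pi*s1*s2*np.sqrt(1 - rho**2))

cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
dist = multivariate_normal(mean=[m1, m2], cov=cov)

for x, y in [(0.0, 0.0), (1.0, -0.5), (2.5, 1.0)]:
    print(f(x, y), dist.pdf([x, y]))   # the two columns should agree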
Bivariate Normal Distribution:
If (X, Y) has the Bivariate Normal(µ1, µ2, σ1, σ2, ρ) distribution, then X ∼ N(µ1, σ1^2) and Y ∼ N(µ2, σ2^2).
The parameter ρ measures the strength of the linear relationship between X and Y and is called the correlation.
In particular, X and Y are independent, and so uncorrelated, if and only if ρ = 0.
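These facts are easy to see empirically. A simulation sketch that draws from a bivariate normal and checks the marginal mean and variance of X and the sample correlation (parameter values arbitrary):

# Sketch: simulated check that X ~ N(mu1, sigma1^2) and corr(X, Y) is about rho.
import numpy as np

rng = np.random.default_rng(1)
m1, m2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]

xy = rng.multivariate_normal([m1, m2], cov, size=200_000)
x, y = xy[:, 0], xy[:, 1]

print(x.mean(), x.var())          # close to mu1 = 1 and sigma1^2 = 4
print(np.corrcoef(x, y)[0, 1])    # close to rho = 0.6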
Bivariate Normal Distribution:
Figure: Bivariate normal density function.
Bivariate Normal Distribution:
Theorem 2.7.6 Let X and Y be jointly absolutely continuous random variables, with joint density f_{X,Y}, and let B ⊆ R^2 be any region. Then
    P((X, Y) ∈ B) = ∫∫_B f_{X,Y}(x, y) dx dy.
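For a region B that is not a rectangle, the double integral can be evaluated by choosing limits that describe B. A sketch for B = the unit disk under a standard bivariate normal density (an arbitrary illustrative choice, with a known exact answer for comparison):

# Sketch: P((X, Y) in B) as the integral of the joint density over B,
# for B = {(x, y): x^2 + y^2 <= 1}.
import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: np.exp(-(x**2 + y**2)/2) / (2*np.pi)   # standard bivariate normal

prob, _ = dblquad(f, -1, 1,
                  lambda x: -np.sqrt(1 - x**2),
                  lambda x: np.sqrt(1 - x**2))
print(prob)                # integral of the density over the unit disk
print(1 - np.exp(-0.5))    # exact value, since X^2 + Y^2 ~ chi-square(2)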
