Graded Homework 4
Random walks
EPFL - Spring Semester 2014-2015
Due Thursday, March 26, at 8:15 AM
Exercise 1. Let $(X_n, n \ge 0)$ be a time-homogeneous Markov chain on $S = \{1, 2, 3, \ldots\}$ with transition probabilities:
$$p_{ij} = \mathbb{P}(X_{n+1} = j \mid X_n = i) = \begin{cases} p_i & \text{if } j = 2i, \\ 1 - p_i & \text{if } j = 1, \\ 0 & \text{otherwise,} \end{cases}$$
where $0 < p_i < 1$ are arbitrary numbers in general (note in particular that we do not ask that $\sum_{i \ge 1} p_i = 1$).
Let us start with the simple case where pi ≡ c for some 0 < c < 1.
a) Which states are transient, which are null-recurrent, and which are positive-recurrent? Justify your answer!
b) Does the chain admit a stationary distribution? If yes, compute it!
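Before working out parts a) and b) on paper, the constant case is easy to explore numerically. The sketch below (the helper name and parameter choices are ours, not part of the exercise) simulates the chain with $p_i \equiv c$ and tabulates the empirical fraction of time spent in each state, which hints at the structure of the answer.

```python
import random

def simulate_constant_chain(c, steps, seed=0):
    """Simulate the chain with p_i = c: from state i, jump to 2*i with
    probability c, otherwise reset to state 1.  Returns the empirical
    fraction of time spent in each visited state."""
    rng = random.Random(seed)
    counts = {}
    state = 1
    for _ in range(steps):
        counts[state] = counts.get(state, 0) + 1
        state = 2 * state if rng.random() < c else 1
    return {s: k / steps for s, k in counts.items()}

freq = simulate_constant_chain(c=0.5, steps=200_000)
# Starting from 1, only powers of 2 are ever visited; note how the
# empirical frequencies behave as the state grows.
for s in sorted(freq):
    print(s, round(freq[s], 4))
```

Varying c (say c = 0.1 versus c = 0.9) shows how the mass shifts between low and high states.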
Assume now that $p_i = 1 - \frac{1}{i+1}$ for $i \ge 1$.
c) Does the chain admit a stationary distribution in this case? If yes, compute it!
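For this choice of $p_i$ one can probe the behaviour numerically before proving anything. Starting from state 1, the only possible excursion is $1 \to 2 \to 4 \to \cdots$, and the probability of completing $n$ consecutive doublings without a reset to 1 is a product of the $p_i$ along the way. The helper below (its name and the exact-arithmetic choice are ours) tabulates these partial products.

```python
from fractions import Fraction

def survival_probability(n):
    """Probability that the walk started at 1 performs n consecutive
    doublings (1 -> 2 -> 4 -> ...) without resetting to 1, for the
    choice p_i = 1 - 1/(i+1) = i/(i+1); exact rational arithmetic."""
    prob, state = Fraction(1), 1
    for _ in range(n):
        prob *= Fraction(state, state + 1)  # p_state = state/(state+1)
        state *= 2
    return prob

for n in (5, 10, 20, 40):
    print(n, float(survival_probability(n)))
```

Watching whether these products tend to zero or stabilise is a useful hint for part c).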
Finally, in the general case:
d) Find an optimal condition on the numbers pi ensuring the existence (and uniqueness) of a stationary distribution.
Exercise 2. a) Let $\mu$ and $\nu$ be two distributions on a state space $S$ (i.e., $\mu_i, \nu_i \ge 0$ for every $i \in S$ and $\sum_{i \in S} \mu_i = \sum_{i \in S} \nu_i = 1$). Show that the following three definitions of the total variation distance between $\mu$ and $\nu$ are equivalent:
1. $\|\mu - \nu\|_{TV} = \frac{1}{2} \sum_{i \in S} |\mu_i - \nu_i|$.
2. $\|\mu - \nu\|_{TV} = \max_{A \subset S} |\mu(A) - \nu(A)|$, where $\mu(A) = \sum_{i \in A} \mu_i$ and $\nu(A) = \sum_{i \in A} \nu_i$.
3. $\|\mu - \nu\|_{TV} = \frac{1}{2} \max_{\phi : S \to [-1,+1]} |\mu(\phi) - \nu(\phi)|$, where $\mu(\phi) = \sum_{i \in S} \mu_i \phi_i$ and $\nu(\phi) = \sum_{i \in S} \nu_i \phi_i$.
Hint: The easiest way is to show that 1 ≤ 2 ≤ 3 ≤ 1.
b) Show that kµ − νkTV is indeed a distance (i.e., that it is non-negative, that it is zero if and only
if µ = ν, that it is symmetric and that the triangle inequality is satisfied).
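The equivalence of the three definitions can be checked numerically on a small example before proving it in general. The snippet below (the two distributions are arbitrary choices of ours) evaluates all three by brute force: subsets are enumerated as bitmasks for definition 2, and for definition 3 it is enough to enumerate the functions $\phi$ with values in $\{-1, +1\}$, since the maximum of a linear functional over $[-1,1]^S$ is attained at such an extreme point.

```python
from itertools import product

S = range(4)
mu = [0.1, 0.4, 0.2, 0.3]      # arbitrary example distribution
nu = [0.25, 0.25, 0.25, 0.25]  # arbitrary example distribution

# Definition 1: half of the l1-distance.
tv1 = 0.5 * sum(abs(a - b) for a, b in zip(mu, nu))

# Definition 2: maximise |mu(A) - nu(A)| over all 2^|S| subsets A,
# encoding each subset A as a bitmask.
tv2 = max(abs(sum(mu[i] - nu[i] for i in S if mask >> i & 1))
          for mask in range(2 ** len(S)))

# Definition 3: enumerate only phi with values in {-1, +1}, the
# extreme points of the set of functions S -> [-1, +1].
tv3 = 0.5 * max(abs(sum(phi[i] * (mu[i] - nu[i]) for i in S))
                for phi in product([-1, 1], repeat=len(S)))

print(tv1, tv2, tv3)  # all three agree
```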
Exercise 3. A complete graph $K_m$ with $m$ vertices is a graph with one edge between every pair of vertices. We consider the homogeneous random walk $(X_n, n \ge 0)$ on $K_m$ defined by the transition probability $\mathbb{P}(X_{n+1} = j \mid X_n = i) = 1/(m-1)$ for all vertices $j \ne i$. We also denote by $\mu_i^{(n)} = \mathbb{P}(X_n = i)$, $i \in K_m$, the probability distribution of a random walk at time $n$ with initial condition $X_0 = i_0$, and by $\nu_j^{(n)} = \mathbb{P}(Y_n = j)$, $j \in K_m$, the probability distribution of another independent random walk with initial condition $Y_0 = j_0$. The two initial vertices are fixed once and for all and are distinct, $i_0 \ne j_0$.
We now define the following homogeneous Markov process $((X_n, Y_n), n \ge 0)$ with state space $K_m \times K_m$ and transition probabilities
$$\mathbb{P}(X_{n+1} = i', Y_{n+1} = j' \mid X_n = i, Y_n = j) = \begin{cases} \frac{1}{(m-1)^2} & \text{if } i \ne j \text{ and } i' \ne i,\ j' \ne j, \\ \frac{1}{m-1} & \text{if } j = i \text{ and } j' = i' \ne i, \\ 0 & \text{in all other cases.} \end{cases}$$
It is understood that this Markov process is conditioned on the initial condition (X0 , Y0 ) = (i0 , j0 ).
a) Show that (Xn , Yn ) is a coupling of the two probability distributions µ(n) and ν (n) .
Hint: Recall the definition of coupling; you have to compute the marginals of P(Xn = i, Yn = j).
b) Consider the coalescence time T = inf{n ≥ 1 | Xn = Yn } (a random variable). First show that
for all n ≥ 1:
$$\mathbb{P}(T > n) = \left(1 - \frac{m-2}{(m-1)^2}\right)^n,$$
and deduce from there that for all $n \ge 1$:
$$\mathbb{P}(T = n) = \frac{m-2}{(m-1)^2}\left(1 - \frac{m-2}{(m-1)^2}\right)^{n-1}.$$
Hint: rewrite the events {T > n} and {T = n} in terms of the X’s and Y ’s.
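The claimed distribution of $T$ can be checked by Monte Carlo simulation. In the sketch below (the helper name and the choice of starting vertices are ours), each walk independently moves to a uniformly chosen different vertex while the two walks disagree, exactly as prescribed by the coupled transition probabilities, and the empirical tail $\mathbb{P}(T > n_0)$ is compared with the formula for one value of $n_0$.

```python
import random

def sample_coalescence_time(m, rng):
    """One draw of T for the coupled walks on K_m; vertices are 0..m-1
    and the (distinct) starting pair 0, 1 is an arbitrary choice."""
    x, y, n = 0, 1, 0
    while x != y:
        # While X != Y, each walk independently jumps to a uniformly
        # chosen vertex different from its current one.
        x = (x + rng.randrange(1, m)) % m
        y = (y + rng.randrange(1, m)) % m
        n += 1
    return n

rng = random.Random(1)
m, trials, n0 = 5, 100_000, 3
times = [sample_coalescence_time(m, rng) for _ in range(trials)]
empirical = sum(t > n0 for t in times) / trials
theoretical = (1 - (m - 2) / (m - 1) ** 2) ** n0
print(empirical, theoretical)
```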
c) Consider the total variation distance kµ(n) − ν (n) kTV . Use the formula in point 1 of exercise 2
to show that
$$\|\mu^{(n)} - \nu^{(n)}\|_{TV} \le e^{-\frac{n(m-2)}{(m-1)^2}}.$$
Hint: 1 − x ≤ e−x for all x ≥ 0.
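Since $K_m$ is small, $\mu^{(n)}$ and $\nu^{(n)}$ can be computed exactly by iterating the transition matrix, which makes the bound easy to verify numerically. The sketch below (helper name is ours) does this for $m = 5$, measuring the distance with definition 1 from exercise 2.

```python
import math

def tv_after_n_steps(m, n, i0=0, j0=1):
    """Exact ||mu^(n) - nu^(n)||_TV for two walks on K_m started at the
    distinct vertices i0 and j0, computed by iterating the transition matrix."""
    P = [[0.0 if i == j else 1.0 / (m - 1) for j in range(m)] for i in range(m)]
    mu = [1.0 if i == i0 else 0.0 for i in range(m)]
    nu = [1.0 if i == j0 else 0.0 for i in range(m)]
    step = lambda d: [sum(d[i] * P[i][j] for i in range(m)) for j in range(m)]
    for _ in range(n):
        mu, nu = step(mu), step(nu)
    # Definition 1 of the total variation distance from exercise 2.
    return 0.5 * sum(abs(a - b) for a, b in zip(mu, nu))

m = 5
for n in range(1, 8):
    tv = tv_after_n_steps(m, n)
    bound = math.exp(-n * (m - 2) / (m - 1) ** 2)
    assert tv <= bound + 1e-12  # the exponential bound holds
    print(n, round(tv, 6), round(bound, 6))
```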
d) We remark that $\pi_i^* = 1/m$, $i \in K_m$, is a stationary distribution for the lazy random walk on the complete graph. Justify this remark. Use this remark and the previous bound to deduce
$$\|\mu^{(n)} - \pi^*\|_{TV} \le e^{-\frac{n(m-2)}{(m-1)^2}}.$$
e) What happens in the (very) particular case $m = 2$?