ISyE 6644 — Fall 2014

Transcription

Homework #5 Solutions
1. Suppose the c.d.f. of X is F(x) = x^4/16, 0 ≤ x ≤ 2. Develop a generator for X.
Demonstrate with U = 0.54.

Solution: Set F(X) = U ∼ Unif(0, 1). Then U = X^4/16, and so X = 2U^{1/4}.
For U = 0.54, we get X = 2(0.54)^{1/4} ≈ 1.71.
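As a quick sketch (not part of the original solution), here is a minimal Python version of this generator; the function name gen_x is ours:

```python
import random

def gen_x(u=None):
    """Inverse-transform generator for F(x) = x^4/16 on [0, 2]: X = 2U^(1/4)."""
    if u is None:
        u = random.random()
    return 2 * u ** 0.25

# The demonstration value from the problem: U = 0.54 gives X close to 1.71.
x = gen_x(0.54)
```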
2. Suppose the p.d.f. of X is

   f(x) = (1/2)(x − 2)     if 2 ≤ x ≤ 3
          (1/2)(2 − x/3)   if 3 ≤ x ≤ 6.

Develop a generator for X. Generate 10 values of X, compute the sample mean,
and compare it to the true mean of the distribution.
Solution: This is a Tria(2,3,6) distribution. See p. 300 of Law, which uses the
notation a = 2, b = 6, and m = 3. Then after a bit of algebra (or after taking the
results directly out of the book), we get the following c.d.f.:

   F(x) = 0                   if x < 2
          (x − 2)²/4          if 2 ≤ x ≤ 3
          1 − (6 − x)²/12     if 3 ≤ x ≤ 6
          1                   if x > 6.
Setting F(X) = U for both cases above, we get

   X = 2 + 2√U              if 0 ≤ U ≤ 1/4
       6 − √(12(1 − U))     if 1/4 < U ≤ 1.

By the material in the text (or via some algebra), E[X] = (a + b + m)/3 = 11/3, and
you should compare your sample mean of 10 observations to this number.
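The requested experiment can be sketched in Python as follows (the seed 6644 is an arbitrary choice of ours, used only for reproducibility):

```python
import math
import random

def gen_tria(u=None):
    """Inverse-transform generator for the Tria(2, 3, 6) distribution."""
    if u is None:
        u = random.random()
    if u <= 0.25:
        return 2 + 2 * math.sqrt(u)
    return 6 - math.sqrt(12 * (1 - u))

random.seed(6644)                       # arbitrary seed, for reproducibility
sample = [gen_tria() for _ in range(10)]
sample_mean = sum(sample) / len(sample)
true_mean = 11 / 3                      # (a + b + m)/3
```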
3. Do a few of the following random variate generation problems from Law, pp.
479–483: #8.1(a,c), 8.8, 8.11(a), 8.12, 8.13, 8.14.
• 8.1(a): Give an algorithm for generating a generalized Cauchy RV with p.d.f.

   f(x) = 1 / {πβ[1 + ((x − γ)/β)²]},   x, γ ∈ ℝ, β > 0.
Solution: Let’s do this via inverse transform. If we pay close attention to the
limits of integration, we find that the c.d.f. is

   F(x) = ∫_{−∞}^{x} f(t) dt = (1/π) arctan((x − γ)/β) + 1/2.

Setting F(X) = U, we obtain

   X = γ + β tan(π(U − 1/2)).
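A one-line Python sketch of this generator (function name ours):

```python
import math
import random

def gen_cauchy(gamma, beta, u=None):
    """X = gamma + beta * tan(pi * (U - 1/2)), the generalized Cauchy inverse c.d.f."""
    if u is None:
        u = random.random()
    return gamma + beta * math.tan(math.pi * (u - 0.5))
```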
• 8.1(c): Give an algorithm for generating a logistic RV with p.d.f.

   f(x) = (1/β) exp{−(x − γ)/β} / [1 + exp{−(x − γ)/β}]²,   x, γ ∈ ℝ, β > 0.
Solution: After a little bit of calculus, we see that the c.d.f. is

   F(x) = [1 + e^{−(x−γ)/β}]^{−1}.

Then a little more algebra gives

   X = F^{−1}(U) = γ − β ln(1/U − 1).
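Similarly, a minimal Python sketch (function name ours):

```python
import math
import random

def gen_logistic(gamma, beta, u=None):
    """X = gamma - beta * ln(1/U - 1), the inverse of the logistic c.d.f."""
    if u is None:
        u = random.random()
    return gamma - beta * math.log(1 / u - 1)
```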
• 8.8: In each of the following cases, give an algorithm that uses exactly one
PRN to generate X.
(a) X = min(U1 , U2 ), where U1 and U2 are i.i.d. Unif(0,1).
Solution: By independence and uniformity,

   P(X > x) = P(U1 > x, U2 > x) = P(U1 > x)P(U2 > x) = (1 − x)².

Thus, F(x) = 1 − (1 − x)², and then a little algebra gives X = 1 − √(1 − U).
(b) X = max(U1 , U2 ), where U1 and U2 are i.i.d. Unif(0,1).
Solution: Similarly,

   F(x) = P(X ≤ x) = P(U1 ≤ x, U2 ≤ x) = P(U1 ≤ x)P(U2 ≤ x) = x².

Then we have X = √U.

(c) X = min(Y1, Y2), where Y1 and Y2 are i.i.d. Exp(1/β).
Solution: Since the distributions are i.i.d. exponential,

   P(X > x) = P(Y1 > x, Y2 > x) = P(Y1 > x)P(Y2 > x) = e^{−2x/β}.

Thus, X ∼ Exp(2/β). (You might remember this result from stochastic
processes class.) So we can generate X = −(β/2) ln(U).

Comment: In all of the above methods, we only needed to use one Unif(0, 1)
instead of two, so perhaps we can save some computer time. On the other
hand, we had to take (slow) square roots in (a) and (b), so maybe the savings
in those cases aren’t so great.

• 8.11(a): Let’s generate RV’s with p.d.f. f(x) = 3x²/2 for −1 ≤ x ≤ 1.
Use the following methods: inverse transform, composition, and acceptance-rejection. Which is preferable?
Solution: First of all, let’s do inverse transform. After a little careful
calculus (watch your limits), we have F(x) = (x³ + 1)/2 for −1 < x < 1.
Then X = (2U − 1)^{1/3}.

Now composition. The p.d.f. is a parabola with its minimum at x = 0, so this
is a good place to do the decomposition. Then we have

   f(x) = 0.5 I_{[−1,0]}(x) 3x² + 0.5 I_{[0,1]}(x) 3x²,

where I is the indicator function. So here’s the algorithm:

(a) Generate U1 and U2 i.i.d. Unif(0, 1).
(b) If U1 ≤ 0.5, set X = −U2^{1/3}. Otherwise, set X = U2^{1/3}.

Finally, acceptance-rejection. Let’s use the majorizing function t(x) = 1.5,
−1 < x < 1 (t(x) simply is the maximum of the p.d.f. f(x)). Then we have
c = 3, r(x) is the Unif(−1, 1) p.d.f., and the algorithm is
(a) Generate U′ ∼ Unif(0, 1) and set Y = 2U′ − 1.
(b) Generate U ∼ Unif(0, 1), independently of U′.
(c) If U ≤ Y², return X = Y. Otherwise, go back to Step (a) and try again.
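For illustration, here are minimal Python sketches of all three methods (function names ours); note that Python’s ** operator needs care with cube roots of negative numbers:

```python
import random

def inverse_transform():
    """X = (2U - 1)^(1/3), handling the cube root of a negative value explicitly."""
    v = 2 * random.random() - 1
    return -((-v) ** (1 / 3)) if v < 0 else v ** (1 / 3)

def composition():
    """Pick the left or right half with probability 0.5 each, then X = +/- U2^(1/3)."""
    u1, u2 = random.random(), random.random()
    x = u2 ** (1 / 3)
    return -x if u1 <= 0.5 else x

def acceptance_rejection():
    """Majorize f by t(x) = 1.5 on (-1, 1); accept Y w.p. f(Y)/t(Y) = Y^2."""
    while True:
        y = 2 * random.random() - 1          # Y ~ Unif(-1, 1)
        if random.random() <= y * y:
            return y
```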
• 8.12: Consider the polar method from the notes (and §8.3.6 of Law) for
generating standard normal RV’s. Show that the probability of acceptance of
W in Step 2 is π/4, and find the distribution of the number of rejections of W
before acceptance finally occurs. What is the expected number of executions
of Step 1?
Solution:

   P(accept) = P(V1² + V2² ≤ 1)
             = P((2U1 − 1)² + (2U2 − 1)² ≤ 1)
             = P((U1 − 1/2)² + (U2 − 1/2)² ≤ 1/4)
             = (area of circle centered at (1/2, 1/2) with radius 1/2) / (area of unit square)
             = π/4.

The number of rejections R before acceptance is geometric with success
probability π/4, i.e., P(R = k) = (1 − π/4)^k (π/4) for k = 0, 1, 2, . . . ;
so the expected number of executions of Step 1 is 4/π ≈ 1.27.

• 8.13: Give a composition algorithm for generating from the Tria(0,m,1)
distribution (0 < m < 1). Compare it with the inverse transform method.
Solution: Chop the triangular distribution into a left-triangular distribution
and a right-triangular distribution (as suggested in the Hint about Problem
8.7). Then do composition with these two pieces.

• 8.14: You’ll need the book for this one.
Solution: Did this in class.
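As a quick numerical sanity check of the π/4 acceptance probability in 8.12 (our addition, not part of the original solutions), a short Monte Carlo estimate:

```python
import math
import random

def polar_accept_rate(n=100_000, seed=6644):
    """Estimate P(V1^2 + V2^2 <= 1) with V_i = 2U_i - 1; should be near pi/4."""
    rng = random.Random(seed)                # arbitrary seed, for reproducibility
    hits = 0
    for _ in range(n):
        v1 = 2 * rng.random() - 1
        v2 = 2 * rng.random() - 1
        if v1 * v1 + v2 * v2 <= 1:
            hits += 1
    return hits / n

rate = polar_accept_rate()
```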
4. Demonstrate that you can generate Nor(0,1) random variates. Show me at least
five different methods. Provide histograms of your results.
Solution: Lots of methods that we did (or mentioned) in class. . .
• Σ_{i=1}^{12} Ui − 6 (approximate)
• Φ^{−1}(U) (need to do table look-up)
• Box–Muller
• Polar method
• A-R
• etc.
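Minimal Python sketches of three of these methods (function names ours):

```python
import math
import random

def clt_normal():
    """Desert-island approximation: sum of 12 uniforms minus 6."""
    return sum(random.random() for _ in range(12)) - 6

def box_muller():
    """Returns a pair of independent Nor(0,1) variates."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

def polar_method():
    """Marsaglia's polar method: an acceptance-rejection form of Box-Muller."""
    while True:
        v1 = 2 * random.random() - 1
        v2 = 2 * random.random() - 1
        w = v1 * v1 + v2 * v2
        if 0 < w <= 1:
            y = math.sqrt(-2 * math.log(w) / w)
            return v1 * y, v2 * y
```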
5. Suppose X ∼ Pois(3). Generate 5000 values of X. Provide a histogram.
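One standard way to generate Pois(λ) variates, sketched in Python (the product-of-uniforms method; the seed and names are ours):

```python
import math
import random

def gen_poisson(lam=3.0):
    """Pois(lam) via multiplying uniforms until the product drops below e^(-lam)."""
    threshold = math.exp(-lam)
    n, prod = 0, random.random()
    while prod > threshold:
        n += 1
        prod *= random.random()
    return n

random.seed(6644)                        # arbitrary seed, for reproducibility
sample = [gen_poisson() for _ in range(5000)]
```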
6. Let’s compare the run times of two Geom(p) generation techniques: (A) running
Bern(p) trials until you get a success, and (B) X = ⌈ln(U)/ln(1 − p)⌉. To do so,
simply time how long it takes to generate 100 million RV’s using each method on
your computer. Make sure that you’re not running anything else on your computer
when you do the comparison runs.
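A rough Python harness for this comparison (names and the demo value p = 0.3 are ours; the counts here are far smaller than 100 million, just to illustrate the setup):

```python
import math
import random
import time

P = 0.3  # hypothetical success probability for the demo

def geom_bernoulli(p=P):
    """Method (A): count Bern(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

def geom_inverse(p=P):
    """Method (B): X = ceil(ln(U) / ln(1 - p))."""
    return math.ceil(math.log(random.random()) / math.log(1 - p))

def time_it(gen, n=100_000):
    """Wall-clock time to generate n variates with the given generator."""
    start = time.perf_counter()
    for _ in range(n):
        gen()
    return time.perf_counter() - start
```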
7. Let’s compare the run times of two standard normal generation techniques: (A)
the desert-island CLT approximation Z = Σ_{i=1}^{12} Ui − 6, and (B) Box–Muller.
Again, you should time how long it takes to generate 100 million RV’s using each
method on your computer. Hints: Make sure that you’re not running anything else
on your computer when you do the comparison runs. And note that Box–Muller
gives you two Nor(0,1)’s at a time. Also, just to make sure that the methods
are giving you reasonable Nor(0,1)’s, you may want to look at histograms of the
outputs or do some formal statistical test for normality.
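A similar Python timing sketch for this comparison (names ours; again with a much smaller count than 100 million, just to show the setup and the two-per-call accounting for Box–Muller):

```python
import math
import random
import time

def clt_normal():
    """Method (A): sum of 12 uniforms minus 6."""
    return sum(random.random() for _ in range(12)) - 6

def box_muller_pair():
    """Method (B): one call yields two independent Nor(0,1) variates."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

def time_normals(n=100_000):
    """Time generating n variates each way; Box-Muller needs only n/2 calls."""
    start = time.perf_counter()
    for _ in range(n):
        clt_normal()
    t_clt = time.perf_counter() - start
    start = time.perf_counter()
    for _ in range(n // 2):
        box_muller_pair()
    t_bm = time.perf_counter() - start
    return t_clt, t_bm
```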
8. Generate and plot 10000 bivariate normal random variates (X, Y ), where both
X and Y are standard normal but such that the correlation between X and Y is 0.9.
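One way to do this (our sketch, not prescribed by the problem): generate Z1, Z2 i.i.d. Nor(0,1) via Box–Muller and set X = Z1, Y = ρZ1 + √(1 − ρ²)Z2, which is the 2×2 Cholesky approach:

```python
import math
import random

RHO = 0.9

def gen_bvn(rho=RHO):
    """Correlated pair: X = Z1, Y = rho*Z1 + sqrt(1 - rho^2)*Z2."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2 * math.log(u1))                 # Box-Muller for Z1, Z2
    z1 = r * math.cos(2 * math.pi * u2)
    z2 = r * math.sin(2 * math.pi * u2)
    return z1, rho * z1 + math.sqrt(1 - rho * rho) * z2

random.seed(6644)                                    # arbitrary seed
pairs = [gen_bvn() for _ in range(10000)]
```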