3. [40 points] In each of the examples below, a sequence $X_1, X_2, \ldots$ of random variables is defined. For each example, determine if it converges in any of the modes we have discussed in class: almost surely, in probability, in $r$'th mean, or in distribution. If it does converge, describe the limiting random variable or distribution. Justify your answers.
(a) [20 points] $X_n = e^n {\bf 1}_{(0,1/n)}(U)$, where $U$ is a random variable which is uniformly distributed on $(0,1)$, and ${\bf 1}_{(0,1/n)}$ denotes the indicator function of the open interval $(0,1/n)$.
Solution: Clearly, $P(X_n \neq 0) = P(U < 1/n) = 1/n$, so $X_n \stackrel{P}{\rightarrow} 0$. Since convergence in probability implies convergence in distribution, we know $X_n \stackrel{d}{\rightarrow} 0$. In fact, it is clear that $X_n \rightarrow 0$ a.s. To see this, note that on the event $[U > 0]$ (which has probability 1), for all $n > 1/U$ we have $U > 1/n$, so $X_n = 0$. Thus, with probability 1 the sequence of random variables is ``eventually'' 0 (i.e., from some point on, the sequence is zero, but the point where this occurs is random).
Now as far as convergence in $r$'th mean, we have
\[ E|X_n|^r = e^{rn} \, P(U < 1/n) = \frac{e^{rn}}{n} \rightarrow \infty . \]
The latter limit holds since the exponential function $e^{rn}$ goes to infinity faster than the linear function $n$. Hence, $X_n$ does not converge in $r$'th mean to 0 for any $r > 0$.
REMARK: Some students observed that $\sum_n P(X_n \neq 0) = \sum_n 1/n = \infty$ and invoked the second Borel-Cantelli lemma to claim the sequence does NOT converge a.s. What is wrong with this argument? The second B.C. lemma requires independence, and these random variables are highly dependent.
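As a quick numerical sketch (not part of the graded solution, and assuming the reconstructed form $X_n = e^n {\bf 1}_{(0,1/n)}(U)$), one can watch a single path of the sequence hit 0 for good, while the exactly computed moments $E|X_n|^r = e^{rn}/n$ blow up:

```python
import math
import random

random.seed(0)

# One draw of U ~ Uniform(0,1); X_n = e^n if U < 1/n, else 0.
U = random.random()

def X(n):
    return math.exp(n) if U < 1.0 / n else 0.0

# On the event [U > 0], X_n = 0 for every n > 1/U: the path is eventually zero.
N = math.ceil(1.0 / U)
tail = [X(n) for n in range(N + 1, N + 51)]
print(all(x == 0.0 for x in tail))  # True: the sequence has hit 0 for good

# E|X_n|^r = e^{rn} * P(U < 1/n) = e^{rn}/n, computed exactly; it diverges
# for every r > 0 because e^{rn} beats the linear factor n.
r = 0.1
moments = [math.exp(r * n) / n for n in (10, 100, 1000)]
print(moments)
```

Note the contrast this makes vivid: the same sequence converges almost surely yet fails to converge in every $r$'th mean.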
(b) [20 points]
\[ X_n = \frac{(-1)^n Z + U/n}{\bar{Y}_n}, \]
where $Z$ is a $N(0,1)$ random variable, $U$ is uniformly distributed on $(0,1)$, and $\bar{Y}_n = \frac{1}{n} \sum_{i=1}^n Y_i$ is the sample mean of i.i.d. random variables $Y_1, Y_2, \ldots$ having an exponential distribution with mean 1.
Solution: We claim that $X_n \stackrel{d}{\rightarrow} Z$, i.e., the limiting distribution is $N(0,1)$. Now $(-1)^n Z \stackrel{d}{\rightarrow} Z$ because each $(-1)^n Z$ is $N(0,1)$. Also, $U/n \stackrel{P}{\rightarrow} 0$ (in fact, $|U/n| \leq 1/n \rightarrow 0$, so $U/n \rightarrow 0$ a.s.). Also, by the Weak Law of Large Numbers, $\bar{Y}_n \stackrel{P}{\rightarrow} E[Y_1] = 1$. So, by Slutsky's theorem, $(-1)^n Z + U/n \stackrel{d}{\rightarrow} Z$, and also by Slutsky's theorem, $X_n = \frac{(-1)^n Z + U/n}{\bar{Y}_n} \stackrel{d}{\rightarrow} Z/1 = Z$.
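The Slutsky argument can be made concrete with a small simulation (a sketch, again assuming the reconstructed form $X_n = ((-1)^n Z + U/n)/\bar{Y}_n$): for large $n$, $\bar{Y}_n$ sits near 1 and $U/n$ near 0, so $X_n$ is essentially $(-1)^n Z$, which is $N(0,1)$ for every $n$.

```python
import random

random.seed(1)

Z = random.gauss(0.0, 1.0)   # Z ~ N(0,1)
U = random.random()          # U ~ Uniform(0,1)

n = 100_000                  # even, so (-1)^n Z = Z
# WLLN: the sample mean of n Exp(1) variables is near E[Y] = 1.
Ybar = sum(random.expovariate(1.0) for _ in range(n)) / n
X_n = ((-1) ** n * Z + U / n) / Ybar

print(abs(Ybar - 1.0))   # small
print(abs(X_n - Z))      # small: X_n is essentially (-1)^n Z = Z at this n
```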
We claim $X_n$ does not converge in probability. Clearly, for large $n$, $X_n$ is approximately equal to $(-1)^n Z$, because of the alternating signs of $(-1)^n Z$, which is the ``dominant'' term in $X_n$. We will show essentially that the sequence violates a Cauchy condition for convergence in probability. Assume that $X_n \stackrel{P}{\rightarrow} X$ (where of course $X$ would have to have a $N(0,1)$ distribution by the last paragraph and the fact that $X_n \stackrel{P}{\rightarrow} X$ implies $X_n \stackrel{d}{\rightarrow} X$). Then we have for any $\epsilon > 0$,
\[ P(|X_{n+1} - X_n| > \epsilon) \leq P(|X_{n+1} - X| > \epsilon/2) + P(|X_n - X| > \epsilon/2) \rightarrow 0 . \]
Put otherwise,
\[ X_{n+1} - X_n \stackrel{P}{\rightarrow} 0 . \qquad (26) \]
Now
\[ X_{n+1} - X_n = (-1)^{n+1} 2Z + R_n^{(1)} + R_n^{(2)} + R_n^{(3)} , \]
where
\[ R_n^{(1)} = (-1)^{n+1} Z \left( \frac{1}{\bar{Y}_{n+1}} - 1 \right), \qquad R_n^{(2)} = (-1)^{n+1} Z \left( \frac{1}{\bar{Y}_n} - 1 \right), \qquad R_n^{(3)} = \frac{U/(n+1)}{\bar{Y}_{n+1}} - \frac{U/n}{\bar{Y}_n} . \]
Now we claim each $R_n^{(i)} \stackrel{P}{\rightarrow} 0$ as $n \rightarrow \infty$. Assuming this is the case, we would have in conjunction with (26) that
\[ (-1)^{n+1} 2Z = (X_{n+1} - X_n) - R_n^{(1)} - R_n^{(2)} - R_n^{(3)} \stackrel{P}{\rightarrow} 0 \]
by a homework exercise, but $(-1)^{n+1} 2Z$ has a $N(0,4)$ distribution for all $n$ and so clearly does not converge to 0 in distribution, and hence not in probability either. Thus, we have obtained a contradiction, and hence we do not have convergence in probability.
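The violated Cauchy condition is easy to see numerically as well (same hypothetical simulation setup as in part (b) above; for simplicity the sketch redraws an independent sample mean at each index, which does not affect the point since all the sample means are near 1):

```python
import random

random.seed(2)

Z = random.gauss(0.0, 1.0)
U = random.random()

def sample_mean_exp(n):
    # Fresh Exp(1) sample mean; near 1 for large n by the WLLN.
    return sum(random.expovariate(1.0) for _ in range(n)) / n

def X(n):
    return ((-1) ** n * Z + U / n) / sample_mean_exp(n)

n = 200_000                # even, so X_n is near Z and X_{n+1} near -Z
diff = X(n + 1) - X(n)
print(abs(diff + 2 * Z))   # small: the difference locks onto -2Z, not onto 0
```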
It remains to verify each $R_n^{(i)} \stackrel{P}{\rightarrow} 0$. For $R_n^{(1)}$, we have $\bar{Y}_{n+1} \stackrel{P}{\rightarrow} 1$, and by a continuous mapping principle, $1/\bar{Y}_{n+1} \stackrel{P}{\rightarrow} 1$, and hence also $\frac{1}{\bar{Y}_{n+1}} - 1 \stackrel{P}{\rightarrow} 0$. Now $|R_n^{(1)}| = |Z| \left| \frac{1}{\bar{Y}_{n+1}} - 1 \right|$, and $Z$ does not depend on $n$, so $R_n^{(1)} \stackrel{P}{\rightarrow} 0$ by the infamous homework exercise (the product of a fixed random variable and a sequence converging to 0 in probability converges to 0 in probability). Similarly $R_n^{(2)} = (-1)^{n+1} Z \left( \frac{1}{\bar{Y}_n} - 1 \right) \stackrel{P}{\rightarrow} 0$, and similarly for $R_n^{(3)}$, since $\frac{U/(n+1)}{\bar{Y}_{n+1}}$ and $\frac{U/n}{\bar{Y}_n}$ are each the product of a sequence tending to 0 a.s. and a sequence converging to 1 in probability.
Since $X_n$ fails to converge in probability, it also cannot converge almost surely or in $r$'th mean, since convergence in either of these modes would imply convergence in probability.