

Solution to Problem 6.

(6) [15 points] The Pareto($\theta$) family of distributions has pdf given by

\begin{displaymath}
f(x\vert\theta) \; = \; \theta x^{-(\theta+1)} I_{(1,\infty)} (x) ,
\end{displaymath}

where $\theta > 0$. (a) Verify that this is a legitimate pdf.

Solution:

\begin{eqnarray*}
\int_1^{\infty} \; \theta x^{-(\theta+1)} \; dx
& \; = \; &
\left[ - x^{-\theta} \right]_{x=1}^{\infty} \\
& \; = \; & 0 - (-1) \\
& \; = \; & 1 .
\end{eqnarray*}

Clearly $f(x\vert\theta) \ge 0$, so it is a legitimate pdf.
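As a numerical sanity check (my own sketch, not part of the solution), one can integrate the pdf over $[1,B]$ with the trapezoid rule and add the exact tail mass $\int_B^\infty \theta x^{-(\theta+1)}\,dx = B^{-\theta}$:

```python
def pareto_pdf(x, theta):
    """Pareto(theta) pdf: theta * x^{-(theta+1)} on (1, infinity)."""
    # the boundary point x = 1 has measure zero, so including it is harmless
    return theta * x ** (-(theta + 1)) if x >= 1.0 else 0.0

theta = 2.0
B, n = 1000.0, 200_000           # truncation point and grid size
h = (B - 1.0) / n

# trapezoid rule on [1, B]
total = 0.5 * h * (pareto_pdf(1.0, theta) + pareto_pdf(B, theta))
total += h * sum(pareto_pdf(1.0 + i * h, theta) for i in range(1, n))

total += B ** (-theta)           # exact tail mass beyond B
print(round(total, 4))           # -> 1.0 (up to discretization error)
```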


(b) Is this an exponential family?

Solution:

\begin{displaymath}
f(x\vert\theta) \; = \; \theta \exp[ - \theta \log x ] x^{-1} I_{(1,\infty)} (x)
\end{displaymath}

which is an exponential family with

\begin{eqnarray*}
c(\theta) & \; = \; & \theta \\
w(\theta) & \; = \; & - \theta \\
t(x) & \; = \; & \log x \\
h(x) & \; = \; & x^{-1} I_{(1,\infty)} (x)
\end{eqnarray*}
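The factorization $f(x\vert\theta) = c(\theta) h(x) \exp[ w(\theta) t(x) ]$ can be checked pointwise (a quick sketch; the function names are mine):

```python
import math

def f(x, theta):                 # original pdf on (1, infinity)
    return theta * x ** (-(theta + 1))

# exponential-family pieces from the solution
c = lambda theta: theta
w = lambda theta: -theta
t = lambda x: math.log(x)
h = lambda x: 1.0 / x            # indicator is 1 for x in (1, infinity)

theta, x = 1.7, 3.5
lhs = f(x, theta)
rhs = c(theta) * h(x) * math.exp(w(theta) * t(x))
print(abs(lhs - rhs) < 1e-12)    # -> True
```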




(c) Does this distribution have a moment generating function that is finite in a neighborhood of the origin?

Solution: Note that

\begin{displaymath}
\int_1^{\infty} \, x^{\theta + 1} f(x\vert\theta) \, dx
\; = \; \theta \int_1^{\infty} \, 1 \, dx \; = \; \infty .
\end{displaymath}

That is, the $(\theta + 1)$-th moment is not finite. If a distribution has a mgf that is finite in a neighborhood of $0$, then it has moments of all orders. Since this distribution does not have moments of all orders, it cannot have a mgf that is finite in a neighborhood of $0$.
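To see the moment failure concretely: for $k < \theta$, $E[X^k] = \int_1^\infty \theta x^{k - \theta - 1}\,dx = \theta/(\theta - k)$, while for $k \ge \theta$ the integral diverges. A Monte Carlo sketch (my own check, sampling via the inverse CDF $X = U^{-1/\theta}$, since $F(x) = 1 - x^{-\theta}$):

```python
import random

random.seed(0)
theta = 3.0
# inverse-CDF sampling: F(x) = 1 - x^{-theta}  =>  X = U^{-1/theta}
xs = [random.random() ** (-1.0 / theta) for _ in range(200_000)]

k = 1  # k < theta, so E[X^k] = theta / (theta - k) = 1.5
mc = sum(x ** k for x in xs) / len(xs)
print(mc)  # close to 1.5; averages of X**k with k >= theta never stabilize
```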


(d) Suppose we have $n$ i.i.d. observations from the Pareto($\theta$) family with $\theta$ unknown. Find the maximum likelihood estimator of $\theta$.

Solution: The log likelihood is

\begin{displaymath}
\log L(\theta) \; = \; n \log \theta \, - \, \theta \sum_{i=1}^n \log X_i
\end{displaymath}

except for the irrelevant terms involving the $h(X_i)$. Since this is an exponential family, the log likelihood is concave in $\theta$, so we can find the mle by setting the derivative to $0$ (otherwise, we would have to verify that the critical point is a maximum):

\begin{displaymath}
n/\theta \, - \, \sum_{i=1}^n \log X_i \; = \; 0
\; \Longrightarrow \;
\hat{\theta} \; = \;
\left[ \frac{1}{n} \sum_{i=1}^n \log X_i \right]^{-1}.
\end{displaymath}
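A quick simulation sketch of this estimator (my own check): with $X = U^{-1/\theta}$, $\log X$ is Exponential($\theta$) with mean $1/\theta$, so the mle should concentrate near the true $\theta$.

```python
import math
import random

random.seed(1)
theta_true = 2.0
xs = [random.random() ** (-1.0 / theta_true) for _ in range(100_000)]

# mle = [ (1/n) * sum of log X_i ]^{-1}
theta_hat = len(xs) / sum(math.log(x) for x in xs)
print(theta_hat)  # close to 2.0
```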




(e) A Bayesian wants to make inferences about $\theta$. He uses a gamma($\alpha$,$\beta$) prior for $\theta$. Show that the posterior for $\theta$ is also a gamma distribution and find its parameters. Also, find the posterior mean of $\theta$.

Solution: The posterior is

\begin{displaymath}
f(\theta\vert x_1,\ldots,x_n) \; \propto \;
\theta^{\alpha-1+n} \exp[-\theta(1/\beta + \sum_{i=1}^n \log x_i) ] .
\end{displaymath}

This is a

\begin{displaymath}
\mbox{gamma} \left(\alpha+n, \left[ 1/\beta + \sum_{i=1}^n \log x_i \right]^{-1} \right)
\end{displaymath}

pdf. The posterior mean of $\theta$ is

\begin{displaymath}
\frac{\alpha +n}{1/\beta + \sum_{i=1}^n \log x_i} .
\end{displaymath}
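The conjugate update can be packaged as a small function (a sketch; the names and parameter choices are mine). With enough data, the posterior mean should settle near the true $\theta$:

```python
import math
import random

def posterior_params(xs, alpha, beta):
    """Gamma(alpha, beta) prior (beta = scale) -> Gamma posterior for theta."""
    shape = alpha + len(xs)
    scale = 1.0 / (1.0 / beta + sum(math.log(x) for x in xs))
    return shape, scale

random.seed(2)
theta_true = 2.0
xs = [random.random() ** (-1.0 / theta_true) for _ in range(50_000)]

shape, scale = posterior_params(xs, alpha=2.0, beta=1.0)
post_mean = shape * scale  # mean of a Gamma(shape, scale) distribution
print(post_mean)  # close to 2.0
```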


Dennis Cox 2003-01-18