Problem of the Week #3 - April 16th, 2012

#1 Chris L T521

Thanks again to those who participated in the second round of our POTW! Now, it's time for the third one!

This week's problem was proposed by yours truly.

-----

Problem: Let $X_i\ (i=1,\ldots,n)$ be (continuous) random variables, each following the exponential distribution $\text{Exp}(\lambda)$, whose probability density function (p.d.f.) is defined by

\[f(x) = \left\{\begin{array}{cl}\lambda e^{-\lambda x} & x\geq 0,\,\lambda >0\\ 0 & x<0\end{array}\right.\]

Show that $\sum_{i=1}^n X_i$ follows the Gamma distribution $\Gamma(n,\theta)$ with $\theta=1/\lambda$, where the p.d.f. of the Gamma distribution is given by

\[f(x) = \left\{\begin{array}{cl}\frac{1}{\theta^n\Gamma(n)}x^{n-1}e^{-x/\theta} & x\geq 0,\,\theta>0,\, n\in\mathbb{Z}^+\\ 0 & x<0\end{array}\right.\]

-----

Here are two hints:

If $X$ is a continuous random variable, we define the moment generating function by

\[M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty}e^{tx}f(x)\,dx\]

where $f(x)$ is the p.d.f. of the random variable $X$. Use the fact that if $\{X_i\}_{i=1}^n$ is a collection of independent random variables, then

\[M_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n M_{X_i}(t)\]

Recall that $\Gamma(x) = \displaystyle\int_0^{\infty}e^{-t}t^{x-1}\,dt$.
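Not part of the problem, but the product rule above is easy to check numerically. Here is a minimal sketch, assuming NumPy is available, that estimates $M_{X_1+X_2}(t)$ by Monte Carlo for two independent $\text{Exp}(\lambda)$ variables and compares it with $M_{X_1}(t)M_{X_2}(t)$; the values $\lambda=2$ and $t=0.5$ are arbitrary choices with $t<\lambda$.

```python
# Monte Carlo check of M_{X1+X2}(t) = M_{X1}(t) * M_{X2}(t) for independent
# exponential variables. The parameter values below are arbitrary (t < lambda).
import numpy as np

rng = np.random.default_rng(0)
lam, t, size = 2.0, 0.5, 200_000

x1 = rng.exponential(scale=1 / lam, size=size)
x2 = rng.exponential(scale=1 / lam, size=size)

mgf_sum = np.mean(np.exp(t * (x1 + x2)))           # estimate of M_{X1+X2}(t)
mgf_prod = np.mean(np.exp(t * x1)) * np.mean(np.exp(t * x2))
closed_form = (lam / (lam - t)) ** 2               # (1/(1 - t/lambda))^2

print(mgf_sum, mgf_prod, closed_form)              # all three should be close
```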

Remember to read the POTW submission guidelines to find out how to submit your answers!

EDIT: I forgot to mention that the $X_i$ are i.i.d. random variables. If they're not, then the above result doesn't hold (thanks to girdav for pointing this out).
 
#2 Chris L T521

Sadly, no one gave a correct solution to this week's problem!

Here's the solution to the problem.

By a simple calculation, we can show that if $X_i\sim \text{Exp}(\lambda)$, then for $t<\lambda$,

\[\begin{aligned}M_{X_i}(t) &= \int_0^{\infty}e^{tx}\lambda e^{-\lambda x}\,dx\\ &= \lambda\int_0^{\infty}e^{-(\lambda-t)x}\,dx\\ &= \frac{\lambda}{\lambda-t}\int_0^{\infty}e^{-u}\,du\quad \text{(by making the substitution $u=(\lambda-t)x$)}\\ &= \frac{\lambda}{\lambda-t}=\frac{1}{1-(t/\lambda)}.\end{aligned}\]
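As a quick sanity check (not part of the original solution), this closed form can be compared with a numerical evaluation of the defining integral. The sketch below assumes SciPy is available; $\lambda=2$ and $t=0.5$ are arbitrary values with $t<\lambda$.

```python
# Numerically evaluate the exponential MGF integral and compare with lambda/(lambda - t).
import numpy as np
from scipy.integrate import quad

lam, t = 2.0, 0.5                                  # any t < lam works
numeric, _ = quad(lambda x: np.exp(t * x) * lam * np.exp(-lam * x), 0, np.inf)
closed_form = lam / (lam - t)                      # = 1/(1 - t/lam)

print(numeric, closed_form)                        # both approximately 1.3333
```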

Let $X\sim\Gamma(n,\theta)$. Then, for $t<1/\theta$,

\[\begin{aligned}M_X(t) &= \int_0^{\infty}e^{tx}\,\frac{1}{\theta^n\Gamma(n)}x^{n-1}e^{-x/\theta}\,dx\\ &= \frac{1}{\theta^n\Gamma(n)}\int_0^{\infty}x^{n-1}\exp\left(-\left(\frac{1}{\theta}-t\right)x\right)\,dx\\
&= \frac{1}{\theta^n\Gamma(n)} \int_0^{\infty}\frac{1}{(1/\theta - t)^n}u^{n-1}e^{-u}\,du\quad\text{(using the substitution $u=(1/\theta - t)x$)}\\ &= \frac{1}{\theta^n\Gamma(n)(1/\theta- t)^n}\Gamma(n)\quad\text{(by definition of the Gamma function)}\\ &=\frac{1}{\theta^n(1/\theta - t)^n}=\frac{1}{(1-t\theta)^n}.\end{aligned}\]
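The same kind of numerical check works for the Gamma MGF. The sketch below (again assuming SciPy, with arbitrary values $n=3$, $\theta=0.5$, $t=0.4$ so that $t<1/\theta$) integrates $e^{tx}f(x)$ against the Gamma p.d.f. and compares the result with $\frac{1}{(1-t\theta)^n}$.

```python
# Numerically evaluate the Gamma(n, theta) MGF integral and compare with 1/(1 - t*theta)^n.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

n, theta, t = 3, 0.5, 0.4                          # need t < 1/theta

def pdf(x):
    return x ** (n - 1) * np.exp(-x / theta) / (theta ** n * gamma_fn(n))

numeric, _ = quad(lambda x: np.exp(t * x) * pdf(x), 0, np.inf)
closed_form = 1 / (1 - t * theta) ** n

print(numeric, closed_form)                        # both approximately 1.9531
```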

Thus, since the $X_i$ are independent and identically distributed,

\[\begin{aligned}M_{\sum_{i=1}^n X_i}(t) &= \prod_{i=1}^n M_{X_i}(t)\\ &= \prod_{i=1}^n \frac{1}{1 - (t/\lambda)}\\ &= \frac{1}{(1-(t/\lambda))^n}\end{aligned}\]

If we make the substitution $\frac{1}{\theta}=\lambda$, then we see that

\[M_{\sum_{i=1}^n X_i}(t) = \frac{1}{(1-t\theta)^n} = M_X(t)\]

where $X\sim\Gamma(n,\theta)$.

Since the two moment generating functions agree, $\sum_{i=1}^n X_i$ and $X$ have the same distribution; that is, $\sum_{i=1}^n X_i\sim\Gamma(n,\theta).\qquad\blacksquare$
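For readers who like empirical confirmation, here is a minimal simulation sketch (not part of the original solution; it assumes NumPy and SciPy are available, with arbitrary values $n=5$, $\lambda=2$). It draws sums of $n$ i.i.d. $\text{Exp}(\lambda)$ variables and runs a Kolmogorov-Smirnov test against $\Gamma(n,\theta)$ with $\theta=1/\lambda$; a large p-value is consistent with the two distributions agreeing.

```python
# Simulate sums of n i.i.d. Exp(lambda) variables and compare with Gamma(n, theta = 1/lambda).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, lam = 5, 2.0
theta = 1 / lam

sums = rng.exponential(scale=1 / lam, size=(100_000, n)).sum(axis=1)

# KS test against the Gamma distribution with shape n, loc 0, scale theta.
print(stats.kstest(sums, 'gamma', args=(n, 0, theta)))
```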
 