Unravelling the Mystery of MGFs: Exploring Moments and More

In summary, the conversation discusses the moment generating function (MGF) and how it can be used to find the mean and variance of a random variable. The formula for the MGF is given, and it is explained how it yields the first, second, and nth moments. The conversation also addresses the convergence of the defining integral and how it depends on the value of t.
  • #1
nacho-man
Please refer to the attached image. The concept of the MGF still plagues me.

I got an invalid answer when I tried this.

What I did was:

$ \int e^{tx}f_{X}(x)dx $
= $ \int_{-\infty}^{+\infty} e^{tx}(p \lambda e^{-\lambda x} + (1-p)\mu e^{-x\mu})dx$

I was a bit wary at this point because it reminded me of the Bernoulli distribution with the p and (1-p), but I could not find any relation for this.

I separated the two integrals and ended up with
$ p \lambda \int_{-\infty}^{+\infty}e^{tx-x\lambda}dx + ... $, which I knew was immediately wrong because that integral does not converge.
What did I do wrong?

What does the MGF even tell us? First, second, nth moment: what does this mean to me?
 

Attachment: Untitled.jpg

  • #2
nacho said:
What did I do wrong?

What does the MGF even tell us? First, second, nth moment: what does this mean to me?

By definition,

$\displaystyle M(t) = E \{ e^{t X} \} = \int_{- \infty}^{+ \infty} f(x)\ e^{t x}\ dx = \int_{0}^{\infty} \{p\ \lambda\ e^{- \lambda x} + (1-p)\ \mu\ e^{- \mu x} \}\ e^{t x}\ dx = \frac{p}{1 - \frac{t}{\lambda}} + \frac{1-p}{1-\frac{t}{\mu}}\ (1)$

Note that the density vanishes for $x < 0$, so the integral effectively runs over $[0, \infty)$; integrating each exponential term over the whole real line is what made your attempt diverge.

Knowing M(t) permits us to find the mean and variance of X with the formula...

$\displaystyle E \{X^{n}\} = M^{(n)} (0)\ (2)$

... so that

$\displaystyle E \{X\} = \frac{p}{\lambda} + \frac{1-p}{\mu}\ (3)$

$\displaystyle E \{X^{2}\} = \frac{2 p}{\lambda^{2}} + \frac{2 (1-p)}{\mu^{2}}\ (4)$

$\displaystyle \sigma^{2} = E \{X^{2} \} - E^{2} \{ X \} = \frac{2 p - p^{2}}{\lambda^{2}} + \frac{2 (1-p) - (1-p)^{2}}{\mu^{2}} - 2\ \frac{p (1-p)}{\lambda \mu}\ (5)$
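These results can be checked symbolically; below is a minimal sketch using sympy (an illustration, not part of the original derivation), assuming positive parameters and $t < \min(\lambda, \mu)$ so that the integral converges.

```python
import sympy as sp

x, t = sp.symbols('x t')
lam, mu, p = sp.symbols('lambda mu p', positive=True)

# Mixture density on [0, oo): p*Exp(lambda) + (1-p)*Exp(mu)
f = p*lam*sp.exp(-lam*x) + (1 - p)*mu*sp.exp(-mu*x)

# MGF: integrate e^{t x} f(x) over [0, oo); conds='none' drops the
# convergence condition t < min(lambda, mu), which we assume holds.
M = sp.simplify(sp.integrate(sp.exp(t*x)*f, (x, 0, sp.oo), conds='none'))

EX = sp.simplify(sp.diff(M, t, 1).subs(t, 0))    # first moment E{X}
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # second moment E{X^2}
var = sp.simplify(EX2 - EX**2)                   # variance

print(M)    # p/(1 - t/lambda) + (1 - p)/(1 - t/mu), up to rewriting
print(EX)   # p/lambda + (1 - p)/mu
print(EX2)  # 2*p/lambda**2 + 2*(1 - p)/mu**2
print(var)
```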

Kind regards

$\chi$ $\sigma$
 
  • #3
chisigma said:
By definition,

$\displaystyle M(t) = E \{ e^{t\ X} \} = \int_{- \infty}^{+ \infty} f(x)\ e^{t\ x}\ dx = \int_{0}^{\infty} \{p\ \lambda\ e^{- \lambda\ x} + (1-p)\ \mu\ e^{- \mu\ x}\ \}\ e^{t\ x}\ d x = \frac{p}{1 - \frac{t}{\lambda}} + \frac{1-p}{1-\frac{t}{\mu}}\ (1)$

$\chi$ $\sigma$
I don't see how this integral converges; how did you get that answer?
 
  • #4
nacho said:
I don't see how this integral converges; how did you get that answer?

It is...

$\displaystyle \lambda\ \int_{0}^{\infty} e^{- (\lambda-t)\ x}\ d x = \frac{\lambda}{t - \lambda}\ \Big[ e^{- (\lambda-t)\ x} \Big]_{0}^{\infty} = \frac{\lambda}{\lambda - t} = \frac{1}{1-\frac{t}{\lambda}}\ (1)$

... and, of course, the integral in (1) converges only if $\displaystyle t< \lambda$. That is not a disadvantage because, from the practical point of view, what matters is the behaviour of M(t) near t=0...
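For a concrete feel, here is a small numerical sketch (with arbitrary example values, assuming $t < \lambda$) confirming that the integral agrees with $\frac{1}{1-\frac{t}{\lambda}}$:

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0                        # arbitrary example rate
for t in (-1.0, 0.0, 1.0, 1.9):  # all values satisfy t < lam
    # numerically integrate lam * e^{-(lam - t) x} over [0, oo)
    val, _ = quad(lambda x: lam * np.exp(-(lam - t) * x), 0, np.inf)
    print(f"t = {t:4.1f}: integral = {val:.6f}, 1/(1 - t/lam) = {1/(1 - t/lam):.6f}")
```

For $t \ge \lambda$ the integrand no longer decays and the integral diverges, which is exactly why M(t) is only defined for $t < \lambda$.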

Kind regards

$\chi$ $\sigma$
 
  • #5


As a scientist, it is important to approach problems with a critical and analytical mindset. In this case, the issue lies in how the MGF integral was set up: the limits of integration must match the support of the density, and the convergence condition on t must be checked before plugging in values.

Additionally, it is important to understand the concept of the MGF and its applications. The MGF, or moment generating function, is a powerful tool in probability and statistics that allows us to calculate the moments of a random variable. These moments, such as the first, second, and nth moments, summarize the distribution: the first moment is the mean, and the first two together give the variance.

In this case, the MGF formula was not properly applied, leading to a divergent integral and an invalid answer. It is important to review the formula carefully and make sure the support of the density and the admissible range of t are correctly accounted for.

Furthermore, it is useful to understand the relationship between the MGF and other probability distributions. The p and (1-p) weights noted in the question are indeed Bernoulli-style mixture weights, and the MGF of a mixture density is simply the same mixture of the component MGFs, which is exactly the structure of equation (1) above.

In conclusion, the MGF can be a subtle concept, but with careful application and understanding it provides valuable insight into the distribution of a random variable.
 

Related to Unravelling the Mystery of MGFs: Exploring Moments and More

1. What is an MGF and how does it relate to moments?

An MGF, or Moment Generating Function, is a mathematical function used in probability theory to determine the moments of a probability distribution. Moments, on the other hand, are numerical measures that describe the shape, location, and scale of a probability distribution. The MGF is a powerful tool for finding the moments of a distribution.

2. How do MGFs help us understand probability distributions?

MGFs help us better understand probability distributions by providing a way to calculate their moments, which are important characteristics of a distribution. These moments, such as the mean and variance, give us insight into the behavior of the distribution and can be used to make predictions about future outcomes.
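As an illustration, a short simulation of the two-exponential mixture from the thread (with arbitrarily chosen parameter values, purely for demonstration) shows the sample mean and variance agreeing with the values obtained from the MGF:

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam, mu = 0.3, 2.0, 5.0       # arbitrary example parameters
n = 200_000

# Draw from the mixture: with probability p an Exp(lam) value, else Exp(mu)
pick = rng.random(n) < p
samples = np.where(pick, rng.exponential(1/lam, n), rng.exponential(1/mu, n))

# Mean and variance read off the MGF derivatives at t = 0
mean_mgf = p/lam + (1 - p)/mu
var_mgf = 2*p/lam**2 + 2*(1 - p)/mu**2 - mean_mgf**2

print("sample mean:", samples.mean(), "  MGF mean:", mean_mgf)
print("sample var :", samples.var(), "  MGF var :", var_mgf)
```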

3. Can MGFs be used for any type of probability distribution?

Yes, MGFs can be used for any type of probability distribution, as long as the distribution has a well-defined MGF. This includes commonly used distributions such as the normal, exponential, and Poisson distributions.
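For example, a few standard MGFs (each valid on the stated range of t) are:

$\displaystyle \text{Exponential}(\lambda):\ M(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda$

$\displaystyle \text{Poisson}(\lambda):\ M(t) = e^{\lambda (e^{t} - 1)}, \quad \text{all } t$

$\displaystyle \text{Normal}(\mu, \sigma^{2}):\ M(t) = e^{\mu t + \frac{1}{2} \sigma^{2} t^{2}}, \quad \text{all } t$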

4. How do MGFs differ from other methods of finding moments?

MGFs are often preferred over other methods of finding moments, such as directly integrating $x^{n} f(x)$ for each n, because they are more efficient: once the MGF is known, every moment follows from repeated differentiation at t = 0, whereas direct integration has to be carried out separately for each moment.

5. What are some real-world applications of MGFs?

MGFs have a wide range of applications in various fields, including finance, physics, and biology. In finance, MGFs are used to model stock prices and predict financial outcomes. In physics, the partition function of statistical mechanics plays the role of a moment generating function for the energy of a system. In biology, MGFs are used to model the growth and development of populations and predict evolutionary outcomes.
