Show that the maximum likelihood estimator is unbiased

In summary, the MLE is ${\mu}_{1}=\frac{1}{n}(\ln(X_{1})+\cdots+\ln(X_{n}))$, and it is unbiased.
  • #1
Fermat1
Consider a density family $f(x,{\mu})=c_{{\mu}}x^{{\mu}-1}\exp(-(\ln(x))^2/2)$, where $c_{{\mu}}=\frac{1}{\sqrt{2{\pi}}}\exp(-{\mu}^2/2)$.
For a sample $(X_{1},...,X_{n})$ find the maximum likelihood estimator and show it is unbiased. You may find the substitution $y=\ln x$ helpful.

I find the MLE to be ${\mu}_{1}=\frac{1}{n}(\ln(X_{1})+...+\ln(X_{n}))$. For unbiasedness, I'm not sure what to do. If I substitute $y_{i}=\ln(x_{i})$ I get $E({\mu}_{1})=\frac{1}{n}(E(Y_{1})+...+E(Y_{n}))$. Am I meant to recognise the distribution of the $Y_{i}$?
 
  • #2
I'm not sure what \(\displaystyle f(x,\mu)\) really is. I suppose it's

\(\displaystyle f(x,\mu)=c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)\).

Given a sample \(\displaystyle (X_1,X_2,\ldots,X_n)\), you've got the MLE \(\displaystyle \mu_1=\frac{1}{n}\sum_{i=1}^{n}\text{ln}X_i\). For this \(\displaystyle f(x,\mu)\), that's right.

To test unbiasedness, you should calculate the expectation of \(\displaystyle \mu_1\).

Thus, we have \(\displaystyle E(\mu_1)=\frac{1}{n}\sum_{i=1}^nE(\text{ln}X_i)\).

Noting that \(\displaystyle E(\text{ln}X)=\int_0^{\infty}\text{ln}x\,f(x,\mu)dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}dt\), can you work out this integral?
 
  • #3
stainburg said:
Noting that \(\displaystyle E(\text{ln}X)=\int_0^{\infty}\text{ln}x\,f(x,\mu)dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}dt\), can you work out this integral?

What substitution are you using?
 
  • #4
Fermat said:
what substitution are you using?
Let \(\displaystyle t=\text{ln}x\in (-\infty, \infty)\), hence \(\displaystyle x=\exp(t)\).

We then have

\(\displaystyle \begin{aligned}
E(\text{ln}X)&=\int_0^{\infty}\text{ln}x\,c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)\,dx\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}\exp(-\mu^2/2)\,t\exp((\mu-1)t)\exp(-t^2/2)\,d(\exp(t))\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp(-(t^2-2\mu t+\mu^2)/2)\,dt\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp(-(t-\mu)^2/2)\,dt,
\end{aligned}\)

which is the mean of an \(\displaystyle N(\mu,1)\) random variable, so \(\displaystyle E(\text{ln}X)=\mu\).
 
  • #5
Yes, recognizing the distribution of the $Y_{i}$ is an important step in showing the unbiasedness of the MLE. In this case, the $Y_{i}$ are normally distributed with mean $\mu$ and variance 1. Using the properties of the normal distribution, we can see that $E(Y_{i})=\mu$ and $Var(Y_{i})=1$. Therefore, $E({\mu}_{1})=\frac{1}{n}(n\mu)=\mu$, which shows that the MLE is unbiased. This means that on average, the MLE will give us the true value of $\mu$ as the estimate.
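The unbiasedness argument above is also easy to sanity-check numerically. The following sketch (not from the thread; standard library only, and the helper names are mine) uses the fact that $\ln X\sim N(\mu,1)$ here, so $X$ can be sampled by exponentiating normal draws; averaging many independent MLE values should land close to the true $\mu$:

```python
import math
import random

# Monte Carlo check (assumed setup, not from the thread): since
# ln X ~ N(mu, 1) for this density, sample X = exp(Z) with Z ~ N(mu, 1),
# then apply the MLE formula mu_1 = (1/n) * sum(ln X_i).
def mle_estimate(mu, n, rng):
    xs = [math.exp(rng.gauss(mu, 1.0)) for _ in range(n)]
    return sum(math.log(x) for x in xs) / n

def average_estimate(mu, n, trials, seed=0):
    # Average many independent MLE values; unbiasedness says this -> mu.
    rng = random.Random(seed)
    return sum(mle_estimate(mu, n, rng) for _ in range(trials)) / trials

print(average_estimate(mu=2.0, n=50, trials=2000))  # close to 2.0
```

With 2000 trials of samples of size 50, the standard error of the average is only a few thousandths, so the printed value sits very near the true $\mu=2$.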
 

Related to Show that the maximum likelihood estimator is unbiased

1. What is the maximum likelihood estimator?

The maximum likelihood estimator is a statistical method used to estimate the parameters of a probability distribution by maximizing the likelihood function, which measures how likely it is for a set of data to occur given a specific model.
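As a concrete illustration of "maximizing the likelihood function" (a sketch of mine, not part of the thread), the code below maximizes the log-likelihood of the density discussed above, $f(x,\mu)=c_\mu x^{\mu-1}e^{-(\ln x)^2/2}$, by a brute-force grid search and checks that the maximizer agrees with the closed-form MLE $\frac{1}{n}\sum\ln X_i$; the function names are hypothetical:

```python
import math

# Log-likelihood of the thread's density family, summed over the sample:
# ell(mu) = -n*ln(sqrt(2*pi)) - n*mu^2/2 + (mu - 1)*sum(ln x) - sum((ln x)^2)/2
def log_likelihood(mu, xs):
    n = len(xs)
    s = sum(math.log(x) for x in xs)
    q = sum(math.log(x) ** 2 for x in xs)
    return -n * math.log(math.sqrt(2 * math.pi)) - n * mu * mu / 2 + (mu - 1) * s - q / 2

def grid_mle(xs, lo=-5.0, hi=5.0, steps=200001):
    # Crude stand-in for a real optimizer: evaluate ell on a fine grid
    # and keep the argmax.
    best_mu, best_ll = lo, float("-inf")
    for i in range(steps):
        mu = lo + (hi - lo) * i / (steps - 1)
        ll = log_likelihood(mu, xs)
        if ll > best_ll:
            best_mu, best_ll = mu, ll
    return best_mu

xs = [0.5, 1.2, 3.4, 2.0]
closed_form = sum(math.log(x) for x in xs) / len(xs)
print(grid_mle(xs), closed_form)  # the grid max matches mean(ln x)
```

Setting the derivative of the log-likelihood to zero gives $-n\mu+\sum\ln x_i=0$, which is exactly why the numerical maximizer reproduces $\frac{1}{n}\sum\ln X_i$.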

2. What does it mean for an estimator to be unbiased?

An estimator is unbiased if the expected value of the estimator is equal to the true value of the parameter being estimated. In other words, the estimator does not consistently overestimate or underestimate the true value.

3. How is the bias of an estimator determined?

The bias of an estimator can be determined by taking the expected value of the estimator and subtracting the true value of the parameter. If the result is equal to 0, the estimator is unbiased. If the result is positive, the estimator is biased towards overestimating the true value, and if the result is negative, the estimator is biased towards underestimating the true value.
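A simulation along these lines (a sketch of mine, not part of the original answer) makes the definition concrete: estimate $E(\hat\theta)-\theta$ directly for the classic divide-by-$n$ and divide-by-$(n-1)$ variance estimators, the standard textbook example of a biased versus unbiased pair:

```python
import random

# Two variance estimators: dividing by n gives a biased estimator with
# E = ((n-1)/n) * sigma^2; dividing by n-1 gives an unbiased one.
def var_biased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def var_unbiased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def simulated_bias(estimator, true_value, n, trials, seed=1):
    # Average the estimator over many N(0, 1) samples, minus the truth.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        total += estimator(xs)
    return total / trials - true_value

# True variance of N(0, 1) is 1; with n = 5 the biased estimator has
# expected value (n-1)/n = 0.8, i.e. a bias of about -0.2.
print(simulated_bias(var_biased, 1.0, 5, 20000))    # near -0.2
print(simulated_bias(var_unbiased, 1.0, 5, 20000))  # near 0
```

The negative simulated bias of the first estimator matches the rule above: a negative result means systematic underestimation.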

4. How is unbiasedness shown for a maximum likelihood estimator?

To show that a maximum likelihood estimator is unbiased, we need to calculate its expected value and show that it is equal to the true value of the parameter. This can be done analytically or through simulation studies.

5. Why is unbiasedness important in statistical estimation?

Unbiasedness is important in statistical estimation because it guarantees that the estimator does not systematically over- or underestimate the parameter: averaged over repeated samples, the estimates centre on the true value. Biased estimators can lead to systematically incorrect conclusions and unreliable results.
