Distribution of Maximum Likelihood Estimator

In summary, the thread discusses finding the maximum likelihood estimator (MLE) for b in a Pareto distribution and determining its exact distribution. The MLE is found to be b' = min Xi. The poster already knows how to obtain the usual large-n normal approximation from the expected information, but the exercise asks for the exact distribution. To show that the MLE itself follows a Pareto distribution (with a replaced by na), the hint suggests considering P(b' > x) and using the fact that b' > x if and only if X1 > x, X2 > x, ..., Xn > x; working with the distribution's cdf makes the calculation easier.
  • #1
Tranquillity
Hey guys how are you? I have the following question:

Let X1,X2,...,Xn be a random sample from a Pareto distribution having pdf
f(x|b)= (a*b^a)/x^(a+1) where x>=b (1)

Determine the maximum likelihood estimator for b, say b', where b ranges over (0, infinity), and by considering P(b'>x) or otherwise show that b' has the Pareto distribution with pdf given by (1) but with a replaced by na.


My attempt: I found the MLE to be b' = min Xi, 1<=i<=n, since the likelihood is monotonically increasing in b, so b should be taken as large as possible while still satisfying x>=b for every observation.

After that, I know how to find the asymptotic distribution of the MLE using the expected (Fisher) information, but that argument only says the MLE is approximately normal for large n.

How do I show that the MLE follows a Pareto distribution in this case? I am really struggling; any help would be much appreciated!

P.S. The hint tells us to consider P(b'>x), but how can I find P(min Xi > x), and why should it help?
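
As a numerical aside (a minimal sketch, not meant as the official solution; the values a = 3, b = 2 and the sample size n = 50 below are arbitrary), the claim that the likelihood keeps increasing in b right up to min Xi, and vanishes beyond it, can be checked directly:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, n = 3.0, 2.0, 50                                # arbitrary true parameters
x = b * (1 - rng.uniform(size=n)) ** (-1 / a)         # inverse-cdf sample from pdf (1)

def log_likelihood(b_cand):
    # log L(b) = n*log(a) + n*a*log(b) - (a+1)*sum(log x), valid only while b <= min(x);
    # for b > min(x) some observation violates x >= b, so the likelihood is 0.
    if b_cand > x.min():
        return -np.inf
    return n * np.log(a) + n * a * np.log(b_cand) - (a + 1) * np.log(x).sum()

grid = np.linspace(0.5, x.min() + 0.5, 2001)
best = grid[np.argmax([log_likelihood(g) for g in grid])]
print("argmax of log-likelihood:", best, " min Xi:", x.min())   # the two should agree
```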
 
  • #2
Hint: min Xi > x if and only if X1 > x and X2 > x and ... and Xn > x.

If you write the distribution's cdf instead of pdf it'll be much easier.
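
In symbols (a small sketch of the hint, with F denoting the common cdf of the Xi; the second equality uses independence and the last one uses the fact that the Xi share the same distribution):

$$P\Bigl(\min_{1\le i\le n} X_i > x\Bigr) = P(X_1 > x,\ X_2 > x,\ \dots,\ X_n > x) = \prod_{i=1}^{n} P(X_i > x) = \bigl[1 - F(x)\bigr]^{n}.$$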
 
  • #3
Let's say P(b'>x) = P(min Xi > x) = P(X1>x, X2>x, ..., Xn>x) = P(X1>x) * P(X2>x) * ... * P(Xn>x) = [P(X1>x)]^n, since the Xi are independent and identically distributed. But then what is P(X1>x)? Have I used the independence correctly?
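
The independence step is used correctly (together with the fact that the Xi are identically distributed), and integrating pdf (1) from x to infinity gives P(X1 > x) = (b/x)^a for x >= b, so the product becomes (b/x)^(na), which is the Pareto survival function with a replaced by na, as the problem statement claims. A minimal simulation sketch of that survival function, with arbitrary parameter values a = 2, b = 1.5, n = 5:

```python
import numpy as np

rng = np.random.default_rng(42)
a, b, n, reps = 2.0, 1.5, 5, 200_000            # arbitrary parameters; reps simulated samples

# Draw `reps` samples of size n from pdf (1) via the inverse cdf and keep each sample's minimum.
u = rng.uniform(size=(reps, n))
mins = (b * (1 - u) ** (-1 / a)).min(axis=1)

# Empirical P(min Xi > x) versus the claimed survival function (b/x)^(n*a).
for x0 in (1.6, 2.0, 3.0):
    print(x0, (mins > x0).mean(), (b / x0) ** (n * a))
```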
 

Related to Distribution of Maximum Likelihood Estimator

1. What is a Maximum Likelihood Estimator (MLE)?

A maximum likelihood estimator (MLE) is the value of a distribution's parameters that maximizes the likelihood function, i.e., the joint density of the observed data regarded as a function of the parameters. Maximum likelihood estimation is commonly used in regression analysis and hypothesis testing.

2. How is MLE different from other estimation methods?

MLE differs from other estimation methods, such as the method of moments or least squares, in that it selects the parameter values under which the observed data are most probable according to the assumed model. It also provides a measure of uncertainty for the estimates: approximate standard errors can be obtained from the curvature of the log-likelihood (the observed or expected Fisher information).

3. What are the assumptions of MLE?

The usual assumptions behind maximum likelihood estimation, particularly for its large-sample properties, include:
1. The sample is randomly selected from the population.
2. The observations are independent from each other.
3. The sample size is sufficiently large.
4. The distribution of the data can be described by a parametric model.
5. The parameters of the model are identifiable.

4. How do you calculate the MLE?

The MLE is calculated by finding the parameter values that maximize the likelihood function (in practice, usually the log-likelihood). When the maximum lies in the interior of the parameter space, this can be done analytically by setting the partial derivatives of the log-likelihood to zero and solving; when it lies on a boundary, as with the Pareto scale parameter b above (where b' = min Xi), the maximizer has to be identified directly. If no closed form exists, numerical optimization methods are used.
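
As a small illustration of both routes (a minimal sketch; the exponential model and the numbers below are chosen purely for simplicity and are not tied to the thread above), the rate of an exponential distribution has the closed-form MLE n/sum(xi), obtained by setting the derivative of the log-likelihood to zero, and the same value can be recovered by numerical optimization:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=1000)      # true mean 2.0, so true rate 0.5

# Analytic route: d/d(lam) [n*log(lam) - lam*sum(x)] = n/lam - sum(x) = 0  =>  lam_hat = n/sum(x)
lam_analytic = len(data) / data.sum()

# Numerical route: minimize the negative log-likelihood over a bounded interval.
neg_loglik = lambda lam: -(len(data) * np.log(lam) - lam * data.sum())
lam_numeric = minimize_scalar(neg_loglik, bounds=(1e-6, 10.0), method="bounded").x

print(lam_analytic, lam_numeric)                  # both should be close to 0.5
```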

5. What are the applications of MLE?

MLE has various applications in statistics, including:
1. Regression analysis: MLE is commonly used to estimate the parameters in linear and non-linear regression models.
2. Hypothesis testing: MLE is used to compare the likelihood of the observed data under different hypotheses.
3. Survival analysis: MLE is used to estimate the survival function in survival analysis.
4. Time series analysis: MLE is used to estimate parameters in time series models such as ARIMA.
5. Machine learning: MLE is used in various machine learning algorithms, such as maximum entropy models and Gaussian mixture models.
