
Thread: Likelihood

  1. MHB Apprentice

    #1
    Assuming α is known, find the maximum likelihood estimator of β

    $$f(x;\alpha,\beta) = \frac{1}{\alpha!\,\beta^{\alpha+1}}\, x^{\alpha} e^{-x/\beta}$$

    I know that the first step is to write down the likelihood $L(\beta)$, but I am not sure I have done it correctly. I came out with the answer below; please can someone tell me where (or whether) I have gone wrong?

    $$L(\beta) = \left(\alpha!\,\beta^{\alpha+1}\right)^{-n}\cdot \sum x_i^{\alpha}\cdot e^{\sum x_i/\beta^{n}}$$
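    A quick way to check an algebraic simplification like this is to compare it numerically with the direct product of densities over a small made-up sample. The Python sketch below is not part of the thread: the data, the function names, and the use of $\Gamma(\alpha+1)$ for $\alpha!$ are my own choices, and the closed form shown is only one candidate simplification to compare against.
    [CODE]
    import numpy as np
    from math import gamma  # gamma(a + 1) equals a! for integer a

    def density(x, alpha, beta):
        """f(x; alpha, beta) = x^alpha * exp(-x/beta) / (alpha! * beta^(alpha+1))."""
        return x**alpha * np.exp(-x / beta) / (gamma(alpha + 1) * beta**(alpha + 1))

    def likelihood_direct(xs, alpha, beta):
        """Likelihood as the direct product of the density over the observations."""
        return np.prod([density(x, alpha, beta) for x in xs])

    def likelihood_candidate(xs, alpha, beta):
        """One candidate closed form: (alpha! * beta^(alpha+1))^(-n) * prod(x_i^alpha) * exp(-sum(x_i)/beta)."""
        xs = np.asarray(xs)
        n = len(xs)
        return (gamma(alpha + 1) * beta**(alpha + 1))**(-n) * np.prod(xs**alpha) * np.exp(-xs.sum() / beta)

    # Made-up observations, for illustration only.
    xs = [1.2, 0.7, 2.4, 1.9]
    alpha, beta = 2, 1.5
    print(likelihood_direct(xs, alpha, beta))     # the two printed values should agree
    print(likelihood_candidate(xs, alpha, beta))
    [/CODE]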

  2. MHB Craftsman
    MHB Math Helper

    #2
    I don't understand your question. The maximum likelihood estimator for a parameter is the value of the parameter that makes a given outcome most likely, but you have not given an "outcome" here.

  3. MHB Craftsman

    #3
    I think that you're going in the right direction; however, your calculation is not entirely correct. Suppose that we are given observations $x_1,\ldots,x_n$ from the given distribution. The likelihood is then given by
    $$\mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = \prod_{i=1}^{n} \frac{1}{\alpha ! \beta^{\alpha+1}} x_i^{\alpha}e^{-x_i/\beta}.$$
    We wish to find the value of $\beta$ that maximizes the likelihood. Since it is quite common to work with the logarithm, let us first take the log of both sides:
    $$\log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = -n \log(\alpha!) - n (\alpha+1) \log(\beta)+ \alpha \sum_{i=1}^{n} \log(x_i) - \frac{\sum_{i=1}^{n} x_i}{\beta}.$$
    Taking the derivative with respect to $\beta$, we obtain
    $$\frac{\partial \log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta)}{\partial\beta} = -\frac{n(\alpha+1)}{\beta} + \frac{1}{\beta^2} \sum_{i=1}^{n} x_i.$$
    To proceed, set this derivative equal to $0$ and solve for $\beta$. The solution is the required MLE.
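    As a sanity check (my own addition, not from the thread), the Python sketch below numerically maximizes this log-likelihood over $\beta$ for some made-up data and compares the maximizer with the value obtained by setting the derivative above to zero, namely $\hat\beta = \frac{1}{n(\alpha+1)}\sum_{i=1}^n x_i$; the data, the function names, and the use of SciPy are my own choices.
    [CODE]
    import numpy as np
    from math import lgamma                      # lgamma(a + 1) equals log(a!) for integer a
    from scipy.optimize import minimize_scalar

    def neg_log_likelihood(beta, xs, alpha):
        """Negative of the log-likelihood written in the post above."""
        xs = np.asarray(xs)
        n = len(xs)
        ll = (-n * lgamma(alpha + 1)
              - n * (alpha + 1) * np.log(beta)
              + alpha * np.sum(np.log(xs))
              - xs.sum() / beta)
        return -ll

    # Made-up observations, for illustration only.
    xs = [1.2, 0.7, 2.4, 1.9, 3.1]
    alpha = 2

    # Numerical maximization of the log-likelihood over beta > 0.
    res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0),
                          args=(xs, alpha), method="bounded")

    # Closed form from setting the derivative to zero: beta_hat = sum(x_i) / (n * (alpha + 1)).
    beta_hat = sum(xs) / (len(xs) * (alpha + 1))
    print(res.x, beta_hat)  # the two values should agree closely
    [/CODE]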
