
Joint density problem.

Jason

New member
Feb 5, 2012
28
I have:

$f_A=\lambda e^{-\lambda a}$

$f_B=\mu e^{-\mu b}$

I need to find the density for $C=\min(A,B)$

($A$ and $B$ are independent).

Is this correct or utterly wrong?

$f_C(c)=f_A(c)+f_B(c)-f_A(c)F_B(c)-F_A(c)f_B(c)$

$=\lambda e^{-\lambda c}+\mu e^{-\mu c}-\lambda e^{-\lambda c}(1-e^{-\mu c})-(1-e^{-\lambda c})\mu e^{-\mu c}$

$=\lambda e^{-\lambda c}e^{-\mu c}+\mu e^{-\lambda c}e^{-\mu c}$

$=(\lambda+\mu)e^{-(\lambda+\mu)c}$
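A quick Monte Carlo sanity check of the claim that $C=\min(A,B)$ is exponential with rate $\lambda+\mu$ (not part of the thread; a Python sketch using arbitrary rates $\lambda=2$, $\mu=3$): if the claim holds, the sample mean of the minima should be close to $1/(\lambda+\mu)=0.2$.

```python
import random

random.seed(0)
lam, mu = 2.0, 3.0  # arbitrary rates chosen for this check
n = 200_000

# Draw independent A ~ Exp(lam) and B ~ Exp(mu), and take the minimum.
mins = [min(random.expovariate(lam), random.expovariate(mu)) for _ in range(n)]
mean_c = sum(mins) / n

# If C = min(A, B) ~ Exp(lam + mu), then E[C] = 1 / (lam + mu) = 0.2,
# so the sample mean should land very close to that.
print(mean_c)
```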
 

CaptainBlack

Well-known member
Jan 26, 2012
890

$f_C(c)=f_A(c)+f_B(c)-f_A(c)F_B(c)-F_A(c)f_B(c)$
You need to explain where this comes from.

Because we have two cases, \( A<B \) and \( A\ge B \), I would start:

$ \large f_C(c)=f_A(c)Pr(B>c|A=c)+f_B(c)Pr(A>c|B=c) $

then independence reduces this to:

$ \large f_C(c)=f_A(c)Pr(B>c)+f_B(c)Pr(A>c) $

so:

\( \large f_C(c)=f_A(c)(1-F_B(c))+f_B(c)(1-F_A(c)) \)
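The formula above can be checked numerically (a Python sketch, not part of the thread; the rates $\lambda=2$, $\mu=3$ and the evaluation points are arbitrary): plugging the exponential density and CDF into $f_A(c)(1-F_B(c))+f_B(c)(1-F_A(c))$ should reproduce the $\mathrm{Exp}(\lambda+\mu)$ density at every $c$.

```python
import math

def f_exp(x, rate):
    """Density of an Exponential(rate) variable: rate * e^{-rate x}."""
    return rate * math.exp(-rate * x)

def F_exp(x, rate):
    """CDF of an Exponential(rate) variable: 1 - e^{-rate x}."""
    return 1 - math.exp(-rate * x)

def f_min(c, lam, mu):
    """Density of min(A, B) via f_A(c)(1-F_B(c)) + f_B(c)(1-F_A(c))."""
    return f_exp(c, lam) * (1 - F_exp(c, mu)) + f_exp(c, mu) * (1 - F_exp(c, lam))

lam, mu = 2.0, 3.0  # arbitrary rates for the check
for c in (0.1, 0.5, 1.0, 2.0):
    # The claim: min(A, B) ~ Exp(lam + mu).
    assert math.isclose(f_min(c, lam, mu), f_exp(c, lam + mu))
print("formula matches the Exp(lam + mu) density")
```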

CB
 