Conditional probability and sum of rvs question

In summary: X is a normal random variable and Y is a mixed random variable, and the poster is stuck computing the conditional probability Pr(Y < A | X + Y > B). The problem may not have a closed-form solution and may need to be evaluated numerically or by simulation.
  • #1
MrFancy
I'm trying to solve a problem as part of my research and it's giving me fits. It seems like it should be simple, but I can't wrap my brain around how to do it. The problem is:

Suppose X~N(0,s), and Y is a random variable that has a probability mass point at 0 but is otherwise uniformly distributed on (0,t], so that:

f(y)=k, y=0
f(y)=(1-k)(1/t), 0 < y ≤ t
f(y)=0 otherwise

What is

Pr(Y < A | X + Y > B)

where A and B are arbitrary constants?

I think I've calculated the convolution of X and Y, but I'm not sure how to get the density from there (and I'm not sure I have the convolution right either). Thanks for any help you can provide.
 
  • #2
Does this look like something that isn't going to have an analytical solution and would need to be simulated?
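If a closed form proves elusive, a Monte Carlo estimate is straightforward. Here's a minimal sketch (assuming s is the standard deviation of X; the values of s, t, k, A, and B are purely illustrative):

[code]
import numpy as np

# Monte Carlo estimate of Pr(Y < A | X + Y > B), where X ~ N(0, s^2) and
# Y has probability mass k at 0 and is otherwise uniform on (0, t].
rng = np.random.default_rng(0)
s, t, k = 1.0, 2.0, 0.3      # illustrative parameters
A, B = 1.0, 0.5              # illustrative thresholds
n = 1_000_000

x = rng.normal(0.0, s, n)
is_zero = rng.random(n) < k                        # pick the mass point with prob. k
y = np.where(is_zero, 0.0, rng.uniform(0.0, t, n))

cond = x + y > B                                   # the conditioning event
estimate = np.mean(y[cond] < A)                    # Pr(Y < A | X + Y > B)
print(f"Pr(Y < A | X + Y > B) ≈ {estimate:.4f}")
[/code]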
 
  • #3
Try using the law of total probability to get

[tex] Pr(Y<a|X+Y>b)=\int_{-\infty}^{\infty} Pr(Y<a|Y>b-x) Pr(X=x) dx [/tex]

Sorry for the sloppy notation, but if you change the limits of the integral, I think you should be able to compute the answer.
 
  • #4
OK, thanks a lot, I hadn't thought of using the law of total probability.

Here's what I did, simplifying a little so that Y has a mass point at 0 and is otherwise U[0,1] instead of U[0,t]:

For the Pr(Y<a|Y>b-x) part,
1. take the cdf of Y evaluated at a, which is a(1-k)+k
2. subtract the cdf of Y evaluated at b-x, which is (b-x)(1-k)+k
3. divide by (1- the cdf of Y evaluated at b-x), which is 1-((b-x)(1-k)+k)

The bounds of integration should be b-a to positive infinity, since this probability is 0 if x<(b-a). Then the Pr(X=x) part is just the pdf of a normal random variable. This gives me:

[tex] Pr(Y<a|X+Y>b)=\int_{b-a}^{\infty} (1-k)\frac{(a-b+x)}{1-(b-x)(1-k)-k}\frac{1}{\sigma\sqrt{2\pi}}exp(-\frac{e^2}{2\sigma^2}) dx [/tex]

Does that look right? If so, isn't that integral too messy to evaluate?
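It may well be too messy to do by hand, but it is easy to evaluate numerically. Here's a minimal sketch with scipy.integrate.quad that simply evaluates the expression as posted (using the normal pdf exp(-x²/(2σ²)) in the exponent, and illustrative values for a, b, k, and σ); comparing the result against a Monte Carlo estimate like the sketch above is a useful sanity check:

[code]
import numpy as np
from scipy.integrate import quad

# Numerically evaluate the integral above, as written, with illustrative values.
a, b, k, sigma = 0.8, 0.5, 0.3, 1.0

def integrand(x):
    # (1-k)(a-b+x) / (1-(b-x)(1-k)-k): the Pr(Y < a | Y > b-x) term from steps 1-3
    cond_prob = (1 - k) * (a - b + x) / (1 - (b - x) * (1 - k) - k)
    # Normal pdf for X with mean 0 and standard deviation sigma
    normal_pdf = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return cond_prob * normal_pdf

value, abs_err = quad(integrand, b - a, np.inf)
print(f"integral ≈ {value:.4f} (error estimate {abs_err:.1e})")
[/code]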
 
  • #5
MrFancy said:
Does that look right? If so, isn't that integral too messy to evaluate?

What you did looks correct, except that the exponent should be in terms of x (i.e. exp(-x^2/(2σ^2))) rather than e, but I think that was a typo. I don't know if that integral can be done analytically, but you can certainly approximate it using numerical methods. If you want to solve the integral, I suggest using software or asking someone else, as I suck at calculus and woke up quite early this morning. Good luck!
 

Related to Conditional probability and sum of rvs question

1. What is conditional probability?

Conditional probability is the likelihood of an event occurring given that another event has already occurred. It is calculated by dividing the probability of both events occurring together by the probability of the conditioning event, as in the formula below.
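In symbols, with a quick worked example for a fair six-sided die:

[tex] P(A\mid B)=\frac{P(A\cap B)}{P(B)}, \qquad P(\text{roll}>4\mid\text{roll is even})=\frac{P(\text{roll}=6)}{P(\text{even})}=\frac{1/6}{1/2}=\frac{1}{3} [/tex]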

2. How is conditional probability different from regular probability?

Regular (unconditional) probability measures the likelihood of an event without reference to any other event. Conditional probability takes a given event into account and adjusts the probability accordingly.

3. Can you give an example of conditional probability?

One example of conditional probability is the likelihood of getting a head on a coin toss, given that the previous toss was a tail. The conditional probability is still 1/2: the tosses are independent, so the previous toss does not affect the outcome of the next one.

4. What is the sum of random variables (rvs)?

The sum of random variables is a new random variable formed by adding two or more random variables together. For independent variables, its distribution is obtained by convolving the individual distributions rather than by simply adding probabilities; see the formula below.
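For two independent continuous random variables X and Y, the density of the sum is the convolution:

[tex] f_{X+Y}(z)=\int_{-\infty}^{\infty} f_X(x)\,f_Y(z-x)\,dx [/tex]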

5. How is the sum of rvs question related to conditional probability?

Questions about sums of random variables, like the one in this thread, ask for the probability of an event that involves the sum (here, X + Y > B). Answering them often requires conditional probability, for example by conditioning on one of the variables and applying the law of total probability, as in the replies above.
