# Merged Bernoulli process

##### New member
Consider two Bernoulli processes X1 and X2 such that
X1[k] is a Bernoulli random variable with P=0.5 and
X2[k] is a Bernoulli random variable with P=0.7 for all k>=0
Let Y be a random process formed by merging X1 and X2, i.e. Y[k] =1 if and only if X1[k] = X2[k] = 1 and Y[k] = 0 otherwise.

a.) Solve for the success probability of Y if X1 and X2 are uncorrelated.
b.) Solve for the success probability of Y if E[X1[k]X2[k]] = 0.3.
c.) If E[X1[k]X2[k]] is constant for all k, find the minimum possible success probability of Y.
d.) If E[X1[k]X2[k]] is constant for all k, find the maximum possible success probability of Y.

My initial thought was that the success probability of Y was just the probability that both X1 and X2 equal 1, i.e. 0.7 * 0.5 = 0.35.
However, the way the problem is written suggests that correlation affects this probability. Can someone explain how this is so?

Edit: I've realized that the two Bernoulli processes may not be independent; therefore, my initial understanding of the problem was wrong.
However, how do I use the information given (the correlation) to solve for the probability? What equation relates the two together, or what assumptions can I make about the problem given the correlation?


#### steep

##### Member
Consider two Bernoulli processes X1 and X2 such that
X1[k] is a Bernoulli random variable with P=0.5 and
X2[k] is a Bernoulli random variable with P=0.7 for all k>=0
Let Y be a random process formed by merging X1 and X2, i.e. Y[k] =1 if and only if X1[k] = X2[k] = 1 and Y[k] = 0 otherwise.

a.) Solve for the success probability of Y if X1 and X2 are uncorrelated.
b.) Solve for the success probability of Y if E[X1[k]X2[k]] = 0.3.

This is pretty hard to read with all the indices and no LaTeX, to be honest. The standard form I've seen for 'merging' Bernoulli (or Poisson) processes is additive, though this problem is clearly using products here, i.e. you have

$Y_k = X_k^{(1)}\cdot X_k^{(2)}$

so $Y_k$ takes on values 0 and 1 and is Bernoulli.

if $X_k^{(1)}$ and $X_k^{(2)}$ are uncorrelated, direct application of the definition tells you that

$E\big[X_k^{(1)}\cdot X_k^{(2)} \big] - E\big[X_k^{(1)} \big] \cdot E\big[X_k^{(2)} \big] = 0$, i.e.
$E\big[X_k^{(1)}\cdot X_k^{(2)} \big] = E\big[X_k^{(1)} \big] \cdot E\big[X_k^{(2)} \big]$

$p = E\big[Y_k\big] = E\big[X_k^{(1)}\cdot X_k^{(2)} \big] = E\big[X_k^{(1)} \big] \cdot E\big[X_k^{(2)} \big] = 0.5 \cdot 0.7 = 0.35$
because the success probability of a Bernoulli random variable equals its expected value.

(b)'s answer is essentially given: since both variables are 0/1-valued, $E\big[X_k^{(1)} X_k^{(2)}\big] = Pr\big(X_k^{(1)} = 1, X_k^{(2)} = 1\big) = p$, so $p = 0.3$.
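If you want to sanity-check (a) and (b) with a few lines of Python (my own sketch, not part of the problem): since the joint cell $Pr(X^{(1)}=1, X^{(2)}=1)$ equals $E[X^{(1)}X^{(2)}]$ for 0/1-valued variables, the marginals plus that one number pin down the whole 2x2 joint pmf, and we can verify it is a valid distribution.

```python
def success_prob(p1, p2, e_x1x2):
    """P(Y=1) = P(X1=1, X2=1) = E[X1*X2] for 0/1-valued X1, X2.

    Builds the full 2x2 joint pmf implied by the marginals p1, p2
    and the cross moment, and checks that every cell is a valid
    probability before returning the success cell.
    """
    p11 = e_x1x2              # only the (1,1) cell contributes to E[X1*X2]
    p10 = p1 - p11            # P(X1=1, X2=0)
    p01 = p2 - p11            # P(X1=0, X2=1)
    p00 = 1.0 - p11 - p10 - p01
    assert all(c >= 0 for c in (p00, p01, p10, p11)), "inconsistent joint pmf"
    return p11

# (a) uncorrelated: E[X1*X2] = E[X1]*E[X2] = 0.5 * 0.7
print(success_prob(0.5, 0.7, 0.5 * 0.7))  # 0.35

# (b) E[X1*X2] given directly
print(success_prob(0.5, 0.7, 0.3))        # 0.3
```

The assertion is what fails if you feed in a cross moment outside the feasible range, which is exactly what parts (c) and (d) ask about.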

I'd like to see some work on your end for (c) and (d)

##### New member
for c and d,

I'm visualizing it to be two sets of 10 spaces each. One has 5 items within it, and the other has 7.
_ _ _ _ _ x x x x x
x x x x x x x _ _ _

and

x x x x x _ _ _ _ _
x x x x x x x _ _ _

It counts as a logic 1 or success if both corresponding spaces are filled.
The minimum would then be 0.5 + 0.7 - 1 = 0.2
and the maximum would be 0.5

#### steep

##### Member
for c and d,

I'm visualizing it to be two sets of 10 spaces each. One has 5 items within it, and the other has 7.
_ _ _ _ _ x x x x x
x x x x x x x _ _ _

and

x x x x x _ _ _ _ _
x x x x x x x _ _ _

It counts as a logic 1 or success if both corresponding spaces are filled.
The minimum would then be 0.5 + 0.7 - 1 = 0.2
and the maximum would be 0.5
I think a visualization like this is very good here and you have the right answer. For a different look at this problem, consider

re: maximization
The thing to keep in mind is that probabilities lie in [0,1], so they are at most one. For the max we have
$p = E\Big[Y_k\Big] = E\Big[X_k^{(1)}\cdot X_k^{(2)} \Big] = Pr\big(X_k^{(1)} = 1, X_k^{(2)} = 1\big) = Pr\big(X_k^{(1)} = 1\big) \cdot Pr\big( X_k^{(2)} = 1 \big \vert X_k^{(1)} = 1 \big)$
$\leq Pr\big(X_k^{(1)} = 1\big) \cdot 1 = 0.5$
which is what we get when we condition on the first random variable. Now conditioning on the second Bernoulli gives

$p = Pr\big(X_k^{(1)} = 1, X_k^{(2)} = 1\big) =Pr\big(X_k^{(2)} = 1, X_k^{(1)} = 1\big) = Pr\big(X_k^{(2)} = 1\big) \cdot Pr\big( X_k^{(1)} = 1 \big \vert X_k^{(2)} = 1 \big)$
$\leq Pr\big(X_k^{(2)} = 1\big) \cdot 1 = 0.7$

so putting these inequalities together we have
$p \leq \min(0.5,\, 0.7) = 0.5$
and 0.5 is the answer, as you've stated.
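The bound $p \leq 0.5$ is actually attained. One construction (my own, not spelled out above): drive both Bernoullis with a single shared uniform draw, so the two success events nest and the rarer one forces the other. A quick simulation sketch:

```python
import random

def comonotone_pair(p1, p2, u):
    """Couple two Bernoullis through one shared uniform u in [0, 1):
    each fires on the low end [0, p) of the interval, so the success
    events nest and P(both = 1) = min(p1, p2)."""
    return (1 if u < p1 else 0, 1 if u < p2 else 0)

random.seed(0)
n = 200_000
hits = sum(x1 * x2 for x1, x2 in
           (comonotone_pair(0.5, 0.7, random.random()) for _ in range(n)))
print(hits / n)  # close to 0.5, the maximum min(0.5, 0.7)
```

Here whenever $X_k^{(1)} = 1$ (i.e. $u < 0.5$) we automatically have $u < 0.7$, so $X_k^{(2)} = 1$ as well.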

re: minimization
letting $A_k$ be the event that the first Bernoulli equals 1 and $B_k$ the event that the second Bernoulli equals 1, we have

(this is the most basic form of inclusion-exclusion)
$P\Big(A_k \bigcup B_k\Big) = P\Big(A_k\Big) + P\Big(B_k\Big) - P\Big(A_k \bigcap B_k \Big)$
note this tells us

$P\Big(A_k \bigcup B_k\Big) \leq P\Big(A_k\Big) + P\Big(B_k\Big)$
(this is called the union bound)
because $P\Big(A_k \bigcap B_k \Big) \geq 0$

and furthermore, since all probabilities are at most 1 we have

$P\Big(A_k \bigcup B_k\Big) \leq \min\Big(1, P\big(A_k\big) + P\big(B_k\big)\Big) = \min\Big(1, 1.2\Big) = 1$

so we revisit
$P\Big(A_k \bigcup B_k\Big) = P\Big(A_k\Big) + P\Big(B_k\Big) - P\Big(A_k \bigcap B_k \Big)$

now rearrange terms
$P\Big(A_k \bigcap B_k \Big) = P\Big(A_k\Big) + P\Big(B_k\Big) - P\Big(A_k \bigcup B_k\Big) = 0.7 + 0.5 - P\Big(A_k \bigcup B_k\Big) = 1.2 - P\Big(A_k \bigcup B_k\Big) \geq 1.2 - 1 = 0.2$
as you've stated
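The lower bound 0.2 is attained too. Using the same shared-uniform device as in the max case (again my own construction, not from the problem), put the two success events at opposite ends of $[0,1)$ so they overlap as little as possible; no simulation is needed, since each $P$ is just an interval length:

```python
# With a single uniform U on [0, 1), let X1 = 1 on [0, 0.5) and
# X2 = 1 on [0.3, 1.0) (length 0.7). The events overlap only on
# [0.3, 0.5), which has length 0.5 + 0.7 - 1 = 0.2.

def overlap(a, b):
    """Length of the intersection of two sub-intervals of [0, 1),
    each given as a (start, end) pair."""
    lo = max(a[0], b[0])
    hi = min(a[1], b[1])
    return max(0.0, hi - lo)

x1_interval = (0.0, 0.5)   # P(X1 = 1) = 0.5
x2_interval = (0.3, 1.0)   # P(X2 = 1) = 0.7
print(overlap(x1_interval, x2_interval))  # 0.2 up to float rounding
```

Sliding the second interval so the two intervals coincide on $[0, 0.5)$ recovers the maximum 0.5, so both extremes in (c) and (d) are achievable.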