# Strange notation or typo?

#### Jameson

Staff member
Problem: Show that if $P(A_i)=1$ for all $i \ge 1$ then $P(\bigcap_{i=1}^{\infty}A_i)=1$.
What is strange about this question is the first part, $P(A_i)=1$ for all $i \ge 1$. If I'm understanding this correctly, that's saying that $P(A_1)=1$, $P(A_2)=1$, ..., $P(A_n)=1$ for every $n \ge 1$. This is only true if $A_1=A_2=\dots=A_n$, because the sum of their probabilities (keeping inclusion-exclusion in mind, of course) cannot be larger than 1. It's obvious that if this is the case, then the probability of the intersection of all of them is 1 as well, but that's not the part that troubles me.

So, do you think this is a typo, or am I misunderstanding the problem?


#### MarkFL

Staff member
I agree both with your interpretation and with the fact that the result is seemingly obvious. I think I would state:

$P(\bigcap_{i=1}^{n}A_i)=n-(n-1)=1$

#### Jameson

Staff member
I'm not familiar with that identity. Does it have a name or can you briefly explain where it comes from?

#### MarkFL

Staff member
I don't know what it's called, and perhaps it is an over-simplification; I was basically using:

$\displaystyle \sum_{i=1}^{n}P(A_i)=n$

and:

$\displaystyle \sum_{i=1}^{n-1}P(A_i \cap A_{i+1})=n-1$

This only pairs consecutive events in the sequence, and it is probably invalid for that reason.

I think a better method would be to use the approach for counting intersections given by the inclusion-exclusion principle.

#### Evgeny.Makarov

##### Well-known member
MHB Math Scholar
Problem: Show that if $P(A_i)=1$ for all $i \ge 1$ then $P(\bigcap_{i=1}^{\infty}A_i)=1$.
What is strange about this question is the first part, $P(A_i)=1$ for all $i \ge 1$. If I'm understanding this correctly, that's saying that $P(A_1)=1$, $P(A_2)=1$, ..., $P(A_n)=1$ for every $n \ge 1$. This is only true if $A_1=A_2=\dots=A_n$
Let $\mathbb{R}\supset A_i=[0,1]\setminus\{1/i\}$ and let $P(A)$ be the measure (length) of $A$. Then $P(A_i)=1$ but $A_i\ne A_j$ for $i\ne j$.
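This example can be sketched in code. The helper below encodes, rather than computes, the measure-theoretic fact that deleting countably many points from $[0,1]$ leaves its length unchanged; the function names are invented for illustration:

```python
from fractions import Fraction

# Model A_i = [0,1] \ {1/i} by the finite set of points removed from [0,1].
def removed_points(i):
    return {Fraction(1, i)}

def measure(removed):
    # Length of [0,1] minus a set of isolated points: each removed point
    # has measure zero, so the length is still 1.  (This encodes the fact;
    # it does not derive it.)
    return 1

A1, A2 = removed_points(1), removed_points(2)
assert A1 != A2                            # the events are different sets
assert measure(A1) == measure(A2) == 1     # yet both have probability 1

# The intersection of A_1 .. A_n removes the union of all the points,
# which is still a measure-zero set.
intersection_removed = set().union(*(removed_points(i) for i in range(1, 11)))
assert measure(intersection_removed) == 1
```

The point is that equal probability does not force equal events: the sets differ only on a set of measure zero.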

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Problem: Show that if $P(A_i)=1$ for all $i \ge 1$ then $P(\bigcap_{i=1}^{\infty}A_i)=1$.
What is strange about this question is the first part, $P(A_i)=1$ for all $i \ge 1$. If I'm understanding this correctly, that's saying that $P(A_1)=1$, $P(A_2)=1$, ..., $P(A_n)=1$ for every $n \ge 1$. This is only true if $A_1=A_2=\dots=A_n$, because the sum of their probabilities (keeping inclusion-exclusion in mind, of course) cannot be larger than 1. It's obvious that if this is the case, then the probability of the intersection of all of them is 1 as well, but that's not the part that troubles me.

So is this a typo or am I misunderstanding the problem do you think?
Hi Jameson!

A proof should be based on the axioms and propositions of probability theory.
See the Wikipedia article on the probability axioms.

Let $B_n=\displaystyle\bigcap_{i=1}^{n}A_i$.

Then $P(\displaystyle \bigcap_{i=1}^{\infty}A_i)=\displaystyle \lim_{n \to \infty}P(B_n)$, by continuity of the probability measure applied to the decreasing sequence $B_n$.

According to the sum rule, we have:
$P(B_n \cup A_{n+1})=P(B_n) + P(A_{n+1}) - P(B_n \cap A_{n+1})$

According to the monotonicity rule and the numeric bound rule, we also have:
$1 = P(A_{n+1}) \le P(B_n \cup A_{n+1}) \le 1$

It follows that:
$1=P(B_n) + 1 - P(B_n \cap A_{n+1})$

$P(B_{n+1}) = P(B_n \cap A_{n+1}) = P(B_n)$

With induction it follows that $P(\displaystyle \bigcap_{i=1}^{\infty}A_i)=P(A_1)=1$. $\qquad \blacksquare$
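The induction step can be spot-checked on a small finite example, where events of probability 1 may still differ by outcomes of probability 0. This is only a sketch; the sample space, its probabilities, and the event names are made up for illustration:

```python
from fractions import Fraction

# Hypothetical finite sample space: outcomes 0..5, where outcomes 4 and 5
# carry probability 0 (so events may differ on them and still have P = 1).
prob = {0: Fraction(1, 4), 1: Fraction(1, 4), 2: Fraction(1, 4),
        3: Fraction(1, 4), 4: Fraction(0), 5: Fraction(0)}

def P(event):
    return sum(prob[w] for w in event)

# Distinct events, each of probability 1.
A = [frozenset({0, 1, 2, 3}), frozenset({0, 1, 2, 3, 4}), frozenset({0, 1, 2, 3, 5})]
assert all(P(Ai) == 1 for Ai in A)

# B_n = A_1 ∩ ... ∩ A_n; the proof says P(B_{n+1}) = P(B_n) = P(A_1) = 1.
B = A[0]
for An in A[1:]:
    # Sum rule: P(B ∪ A_{n+1}) = P(B) + P(A_{n+1}) - P(B ∩ A_{n+1})
    assert P(B | An) == P(B) + P(An) - P(B & An)
    B = B & An
    assert P(B) == 1   # the probability never drops below 1
```

Each pass through the loop mirrors one application of the sum rule plus the bound $P(B_n \cup A_{n+1}) = 1$.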

#### Jameson

Staff member

1) I really need a class on measure theory and probably set theory as well to understand this problem and similar ones on a deeper level. I'm trying to build something with improper tools.

2) My conclusion that this problem implies that $A_1=A_2=\dots=A_n$ is incorrect, although I am still processing the details of why.

@ILikeSerena - That is the idea that my professor was hinting towards, although this isn't a proof based class so he doesn't expect that kind of rigor. However, I am trying to attempt formal proofs where possible so I will review yours and post back if I don't follow something. From a short glance at it though, I think I follow each step.

Thanks again to all who have replied!

#### Klaas van Aarsen

##### MHB Seeker
Staff member
@ILikeSerena - That is the idea that my professor was hinting towards, although this isn't a proof based class so he doesn't expect that kind of rigor. However, I am trying to attempt formal proofs where possible so I will review yours and post back if I don't follow something. From a short glance at it though, I think I follow each step.
I did compress the proof a little and skipped a couple of steps, since I mostly wanted to highlight what was probably intended.
Let me know if you need any explanation.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Btw, your "mistake" was the assumption that P(A)=1 implies that A is the set of all possible outcomes.
As Evgeny.Makarov showed, this is not necessarily the case.
The converse does hold: if A is the set of all possible outcomes, then P(A)=1.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
I just thought up a more specific and perhaps more intuitive example.

Suppose we roll a die with 6 sides.
Let $A_i$ be the event that the result is less than $6+i$.
That is, $A_i$ is the set of outcomes $\{1,2,3,\dots,6+i-1\}$.
Then the events are not identical, but the probability of each of them is still 1.
Furthermore, the probability of an outcome in their intersection is also 1.
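Since everything here is finite, this die example can be checked directly. A minimal sketch (the helper and event names are my own, not anything standard):

```python
from fractions import Fraction

# Fair six-sided die: each outcome 1..6 has probability 1/6.
prob = {w: Fraction(1, 6) for w in range(1, 7)}

def P(event):
    # Only possible outcomes carry probability mass; the "impossible"
    # elements of an event (7, 8, ...) contribute nothing.
    return sum(p for w, p in prob.items() if w in event)

# A_i = {1, 2, ..., 6+i-1}: the result is less than 6+i.
def A(i):
    return set(range(1, 6 + i))

assert A(1) != A(2)                              # different sets of outcomes
assert all(P(A(i)) == 1 for i in range(1, 6))    # yet each has probability 1

intersection = set.intersection(*(A(i) for i in range(1, 6)))
assert P(intersection) == 1
print(sorted(intersection))   # → [1, 2, 3, 4, 5, 6]
```

The intersection is exactly the set of possible outcomes, so its probability is 1 even though the $A_i$ differ as sets.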

#### rashtastic

##### New member
Does this look okay?

$P(\cap{A_{i}})<1\Rightarrow 1-P(\cap{A_{i}})=P((\cap{A_i})^{C})=P(\cup{A_{i}^{C}})>0.$

$P(\cup{A_{i}^{C}})>0\Rightarrow\exists j\in{\mathbb{N}}$ with $P(A_j^{C})>0$. (Consider that $P(\cup{A_{i}^{C}})\leq\sum{P(A_{i}^{C})}$)

Then $P(A_{j})=1-P(A_{j}^{C})<1$, contradicting our premise.
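This contrapositive argument can also be illustrated on a finite sample space, with a uniform measure standing in for the general case (the whole setup below is invented for illustration):

```python
from fractions import Fraction

# Uniform measure on a small finite sample space.
omega = frozenset(range(6))

def P(event):
    return Fraction(len(event & omega), len(omega))

def C(event):          # complement within the sample space
    return omega - event

# One event falls short of probability 1; the intersection then does too.
events = [omega, omega - {0}, omega]
intersection = frozenset.intersection(*events)

assert P(intersection) < 1
# De Morgan: (∩A_i)^C = ∪A_i^C, so the union of complements is positive...
assert P(C(intersection)) == P(frozenset.union(*(C(e) for e in events))) > 0
# ...and by subadditivity some single complement must be positive,
# i.e. some P(A_j) < 1, exactly the contradiction the proof exploits.
assert any(P(C(e)) > 0 for e in events)
```

The last assertion is the contrapositive in miniature: $P(\cap A_i) < 1$ forces $P(A_j) < 1$ for some $j$.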

I'd like to find a way to show mutual independence without using induction.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Does this look okay?

$P(\cap{A_{i}})<1\Rightarrow 1-P(\cap{A_{i}})=P((\cap{A_i})^{C})=P(\cup{A_{i}^{C}})>0.$

$P(\cup{A_{i}^{C}})>0\Rightarrow\exists j\in{\mathbb{N}}$ with $P(A_j^{C})>0$. (Consider that $P(\cup{A_{i}^{C}})\leq\sum{P(A_{i}^{C})}$)

Then $P(A_{j})=1-P(A_{j}^{C})<1$, contradicting our premise.

I'd like to find a way to show mutual independence without using induction.
Looks fine to me.

Suppose $P(\cap{A_{i}}) \ne 1$, then:

That way, you give a proper introduction to a proof by contradiction, since you'd be specifying the premise you're contradicting.

#### rashtastic

##### New member
Looks fine to me.

Suppose $P(\cap{A_{i}}) \ne 1$, then:​

That way, you give a proper introduction to a proof by contradiction, since you'd be specifying the premise you're contradicting.
Thanks, ILikeSerena. Here's another attempt...

Take any subset $\Lambda$ of $\{A_{1}^{C}, A_{2}^{C}, A_{3}^{C}, \dots\}$. We can label this subset $\Lambda=\{A_{j_{1}}^{C}, A_{j_{2}}^{C}, A_{j_{3}}^{C},\dots\}.$ Consider $P(\bigcap_{\lambda\in\Lambda}\lambda)$.

$P(\bigcap_{\lambda\in\Lambda}\lambda)=P(A_{j_{1}}^{C}\cap A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap\dots)=P(A_{j_{1}}^{C})P(A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap\dots\mid A_{j_{1}}^{C})$

$=(1-P(A_{j_{1}}))P(A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap\dots\mid A_{j_{1}}^{C})=0=\prod_{\lambda\in\Lambda}{P(\lambda)}.$

Because our choice of $\Lambda$ is arbitrary, the elements of $\{A_{1}^{C}, A_{2}^{C}, A_{3}^{C}, \dots\}$ are mutually independent. This implies that (*needs citation!) $A_{1}, A_{2}, A_{3}, \dots$ are also mutually independent, so that $P(\cap{A_{i}})=\prod{P(A_{i})}=1.$

#### Klaas van Aarsen

##### MHB Seeker
Staff member
Thanks, ILikeSerena. Here's another attempt...

Take any subset $\Lambda$ from $\{A_{1}^{C}, A_{2}^{C}, A_{3}^{C}, ...\}$. We can label this subset $\Lambda=\{A_{j_{1}}^{C}, A_{j_{2}}^{C}, A_{j_{3}}^{C}...\}.$ Consider $P(\bigcap_{\lambda\in\Lambda}\lambda)$.

$P(\bigcap_{\lambda\in\Lambda}\lambda)=P(A_{j_{1}}^{C}\cap A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap...)=P(A_{j_{1}}^{C})P(A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap...|A_{j_{1}}^{C})$

$=(1-P(A_{j_{1}}))P(A_{j_{2}}^{C}\cap A_{j_{3}}^{C}\cap...|A_{j_{1}}^{C})=0=\Pi_{\lambda\in\Lambda}{P(\lambda)}.$

Because our choice of $\Lambda$ is arbitrary, the elements of $\{A_{1}^{C}, A_{2}^{C}, A_{3}^{C}, ...\}$ are mutually independent. This implies that (*needs citation!) $A_{1}, A_{2}, A_{3}, ...$ are also mutually independent, so that $P(\cap{A_{i}})=\prod{P(A_{i})}=1.$
Ah well, I've given up on trying to verify whether it is correct.
Since we already have two proofs... is there any reason to introduce a new one that is more obscure and that makes leaps which take too much time to verify properly?

I am more the type of guy who prefers proofs that leap to mind instantly, being obvious in their simplicity.

#### rashtastic

##### New member
Ah well, I've given up on trying to verify if it is correct.
Since we already have 2 proofs... any reason to introduce a new one that is more obscure and that makes leaps that really take too much time to verify properly?

I am more the type of guy that prefers proofs that leap to the mind instantly, being obvious in their simplicity.
The goal for me is not to verify the problem as many times as possible but to learn some probability from it. Mutual independence is a completely different argument, a stronger result, and, I think, a more direct method, but I don't know if the argument works. Therefore I have something to learn, even if that thing is "This is a terrible proof."

#### awkward

##### Member
Here is a proof that is perhaps more directly grounded in the axioms of a probability space (see the Wikipedia article on probability spaces).

$\Pr[\cap_{i=1}^{\infty} A_i]$

$= 1 - \Pr[(\cap_{i=1}^{\infty} A_i)^c]$

$= 1 - \Pr[\cup_{i=1}^{\infty} A_i^c]$

$\ge 1 - \sum_{i=1}^{\infty} \Pr(A_i^c)$

$= 1 - \sum_{i=1}^{\infty} 0$

$= 1-0$

$= 1$

Since a probability can be at most 1, the inequality is in fact an equality, which completes the proof.
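This inequality chain can be checked numerically on a finite sample space. A sketch under a uniform measure (the helper names here are my own assumptions, not anything from the thread):

```python
from fractions import Fraction

# Uniform measure on a finite sample space, standing in for the general case.
omega = frozenset(range(8))

def P(event):
    return Fraction(len(event), len(omega))

def check_chain(events):
    inter = frozenset.intersection(*events)
    complements = [omega - e for e in events]
    union_c = frozenset.union(*complements)
    # Step by step: P(∩A_i) = 1 - P((∩A_i)^c) = 1 - P(∪A_i^c) ≥ 1 - Σ P(A_i^c)
    assert P(inter) == 1 - P(union_c)
    assert P(inter) >= 1 - sum(P(c) for c in complements)

# The bound holds for arbitrary events...
check_chain([omega - {0}, omega - {0, 1}, frozenset(range(4))])
# ...and when every P(A_i^c) = 0, it pins P(∩A_i) between 1 and 1.
check_chain([omega, omega, omega])
```

When each complement has probability 0, the lower bound is 1, and since no probability exceeds 1, equality follows.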