Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$

oyth94

Member
Jun 2, 2013
33
Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$
 

TheBigBadBen

Active member
May 12, 2013
84
Re: Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$

Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$
What is F(X,Y) here? Is this the cumulative distribution function?

Are we given any information about X and Y in the problem?
 

oyth94

Member
Jun 2, 2013
33
What is F(X,Y) here? Is this the cumulative distribution function?

Are we given any information about X and Y in the problem?
I think this is the joint CDF. This question is related to joint probability in some way.

I did $\lim_{y \to \infty} F_{X,Y}(x,y) = P(X \leq x, Y \leq \infty)$.
After that, I am not sure if I skipped a step or went in the wrong direction, but my next step was to conclude that
$$= P(X \leq x) = F_X(x)$$

please help!!
 

chisigma

Well-known member
Feb 13, 2012
1,704
Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$
By definition it is...

$$F_{X,Y} (x,y) = P \{X<x,Y<y\}$$

... and because if y tends to infinity then $P \{Y<y\}$ tends to 1 we have...

$$\lim_{y \rightarrow \infty} F_{X,Y} (x,y) = P \{X<x\} = F_{X} (x)$$

Kind regards

$\chi$ $\sigma$
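As a quick numerical sanity check of this limit, one can evaluate a joint CDF with a known closed form at increasing $y$. The distribution below is purely hypothetical and not from the problem: an FGM (Farlie-Gumbel-Morgenstern) copula with Exp(1) marginals, chosen only because its joint CDF has a simple formula while keeping $X$ and $Y$ dependent.

```python
import math

# Hypothetical example: FGM-copula joint CDF over Exp(1) marginals,
# C(u, v) = u*v*(1 + ALPHA*(1-u)*(1-v)).  Not the problem's distribution;
# chosen only for its closed form and built-in dependence.
ALPHA = 0.5  # FGM dependence parameter, |ALPHA| <= 1

def F_X(t):
    """Marginal CDF of an Exp(1) random variable."""
    return 1.0 - math.exp(-t) if t > 0 else 0.0

def F_XY(x, y):
    """Joint CDF via the FGM copula applied to the two marginals."""
    u, v = F_X(x), F_X(y)
    return u * v * (1.0 + ALPHA * (1.0 - u) * (1.0 - v))

x = 1.3
for y in (1.0, 5.0, 50.0):
    print(f"y={y:5.1f}  F_XY(x,y)={F_XY(x, y):.6f}  F_X(x)={F_X(x):.6f}")
# As y grows, F_XY(x, y) converges to F_X(x).
```

This only illustrates the limit for one particular joint distribution, of course; the proof above is what covers the general case.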
 

oyth94

Member
Jun 2, 2013
33
So I did it correctly? No steps skipped?
 

TheBigBadBen

Active member
May 12, 2013
84
So I did it correctly? No steps skipped?
I would think that unless your professor wants you to prove this using $\epsilon$s and $\delta$s, you've said as much as you need to say. However, the best way to know is to ask the person who will be grading your homework.
 

oyth94

Member
Jun 2, 2013
33
There is a similar question. This one is not a limit question, but it does involve joint probability, independence, etc.
The question is:

$$F_{X,Y}(x,y) \leq F_X(x), F_Y(y)$$

I know that when finding $F_X(x)$ the integral is with respect to $y$, and when finding $F_Y(y)$ we integrate with respect to $x$. To check independence, we multiply the two together and see whether the product equals $F_{X,Y}(x,y)$.
But I'm not sure whether this question is about independence or something else. How should I go about proving it?
 

TheBigBadBen

Active member
May 12, 2013
84
I don't think the premise of the question is true in general; you would have to provide more information about the distributions of $X$ and $Y$.
 

oyth94

Member
Jun 2, 2013
33
I don't think the premise of the question is true in general; you would have to provide more information about the distributions of $X$ and $Y$.
This was all that was given in the question. So I am confused now... Or can we prove by contradiction if possible?
 

TheBigBadBen

Active member
May 12, 2013
84
This was all that was given in the question. So I am confused now... Or can we prove by contradiction if possible?
Sorry, I was a little hasty with that; let me actually be sure about what the question states. My understanding is that you are to prove that for any $X,Y$ with some joint cdf $F_{X,Y}$ and where $X$ and $Y$ are not necessarily independent, we can state that
$$
F_{X,Y}(x,y)\leq F_X(x)\cdot F_Y(y)
$$
Or, phrased in a different way:
$$
P(X< x \text{ and } Y< y) \leq P(X < x)\cdot P(Y < y)
$$

If the above is what you meant, I would pose the following counterargument: we could equivalently state
$$
P(X< x|Y<y) \cdot P(Y< y) \leq P(X < x)\cdot P(Y < y) \Rightarrow \\
P(X<x|Y<y) \leq P(X<x)
$$
and that simply isn't true for all distributions. That is, prior knowledge of another variable can increase the probability of an event. Would you like a counter-example to this claim?

If that's not what you meant, or if there's more information about $X$ and $Y$ that you're leaving out, do say so.
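A minimal numerical sketch of such a counter-example, assuming the extreme case $Y = X$ with $X$ uniform on $(0,1)$ (this pair is my own illustration, not part of the original problem):

```python
# Counter-example sketch for the product reading F_{X,Y}(x,y) <= F_X(x)*F_Y(y):
# take Y = X with X ~ Uniform(0,1), the extreme case of dependence.
def F_X(t):
    """Marginal CDF of Uniform(0,1)."""
    return max(0.0, min(1.0, t))

def F_XY(x, y):
    """Joint CDF of (X, X): P(X < x and X < y) = P(X < min(x, y))."""
    return F_X(min(x, y))

x = y = 0.5
print(F_XY(x, y))        # 0.5
print(F_X(x) * F_X(y))   # 0.25, so the product inequality fails here
```

Here knowing $Y<y$ tells you everything about $X$, so the joint probability exceeds the product of the marginals, exactly as the conditional-probability argument above predicts.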
 

TheBigBadBen

Active member
May 12, 2013
84
I realize what you conceivably could have meant (and probably did mean) is

$$
\text{for any }y:\; F_{X,Y}(x,y)\leq F_X(x) \text{ AND }\\
\text{for any }x:\; F_{X,Y}(x,y)\leq F_Y(y)
$$

Is this what you meant to prove? Then yes, we can prove this by integration, as you rightly mentioned.

Please, please, please: try to be clearer in the future about what you mean, even if it makes your post a little longer.
 

oyth94

Member
Jun 2, 2013
33
I realize what you conceivably could have meant (and probably did mean) is

$$
\text{for any }y:\; F_{X,Y}(x,y)\leq F_X(x) \text{ AND }\\
\text{for any }x:\; F_{X,Y}(x,y)\leq F_Y(y)
$$

Is this what you meant to prove? Then yes, we can prove this by integration, as you rightly mentioned.

Please, please, please: try to be clearer in the future about what you mean, even if it makes your post a little longer.
Hi, my apologies, this is actually what I meant to say. So how does it work after integration? Am I doing: the integral from $0$ to $y$ of $F_X(x)\,dy$ multiplied by the integral from $0$ to $x$ of $F_Y(y)\,dx$, to get the integral for $F_{X,Y}(x,y)$? Okay, something is wrong; I don't think that makes sense, does it?
 

oyth94

Member
Jun 2, 2013
33
Re: Prove that $\lim_{y \to \infty} F_{X,Y}(x,y) = F_X(x)$

For the counterargument, why did you use conditional probability and multiply by $P(Y<y)$?

Oh sorry, never mind. I understand why you used the conditional probability multiplied by $P(Y<y)$: it gives the intersection of $X<x$ and $Y<y$. So that is the counterargument used against this proof?
 

TheBigBadBen

Active member
May 12, 2013
84
Hi, my apologies, this is actually what I meant to say. So how does it work after integration? Am I doing: the integral from $0$ to $y$ of $F_X(x)\,dy$ multiplied by the integral from $0$ to $x$ of $F_Y(y)\,dx$, to get the integral for $F_{X,Y}(x,y)$? Okay, something is wrong; I don't think that makes sense, does it?
So the proof of the first inequality via integrals would go something like this:
First of all, definitions. We assume there is some joint probability density function $f_{X,Y}$. We can then write $F_X, F_Y,$ and $F_{X,Y}$ as integrals, using $u, v$ as dummy variables:
$$
F_{X,Y}(x,y)=\int_{-\infty}^y \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv\\
F_{X}(x)=\int_{-\infty}^{\infty} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv\\
F_{Y}(y)=\int_{-\infty}^{y} \int_{-\infty}^\infty f_{X,Y}(u,v)\,du\,dv
$$

With that in mind, we may state that
$$
F_{X}(x)=\int_{-\infty}^{\infty} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv\\
= \int_{-\infty}^{y} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv +
\int_{y}^{\infty} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv
$$
Since $f_{X,Y}$ is a probability density, it is non-negative at every point, which means that both of the above integrals are non-negative. This allows us to state that
$$
\int_{-\infty}^{y} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv +
\int_{y}^{\infty} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv \geq \\
\int_{-\infty}^{y} \int_{-\infty}^x f_{X,Y}(u,v)\,du\,dv = F_{X,Y}(x,y)
$$
Thus, we may conclude that in general $F_{X,Y}(x,y) \leq F_X(x)$; the proof that $F_{X,Y}(x,y) \leq F_Y(y)$ is analogous.

However, it is not necessary to appeal to these integral definitions to carry out the proof. I'll post an alternate, simpler proof as well.
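The inequality can also be spot-checked numerically. The sketch below assumes a hypothetical dependent pair (an FGM copula over Exp(1) marginals, not the problem's distribution) and verifies $F_{X,Y}(x,y) \leq F_X(x)$ and $F_{X,Y}(x,y) \leq F_Y(y)$ over a grid of points:

```python
import math

# Hypothetical dependent pair: FGM-copula joint CDF over Exp(1) marginals.
ALPHA = 0.9  # FGM dependence parameter (an arbitrary choice with |ALPHA| <= 1)

def F_marg(t):
    """Exp(1) marginal CDF."""
    return 1.0 - math.exp(-t) if t > 0 else 0.0

def F_joint(x, y):
    """FGM-copula joint CDF: C(u, v) = u*v*(1 + ALPHA*(1-u)*(1-v))."""
    u, v = F_marg(x), F_marg(y)
    return u * v * (1.0 + ALPHA * (1.0 - u) * (1.0 - v))

# Check the joint CDF never exceeds either marginal CDF on a grid.
grid = [0.1 * k for k in range(1, 50)]
ok = all(
    F_joint(x, y) <= F_marg(x) and F_joint(x, y) <= F_marg(y)
    for x in grid for y in grid
)
print(ok)  # True
```

A single distribution proves nothing, but a failing check would have caught a wrong reading of the inequality immediately.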
 

TheBigBadBen

Active member
May 12, 2013
84
The Easier Proof (via set theory)

Consider the following definitions:
$$
F_{X,Y}(x,y)=P(X< x \text{ and } Y < y)\\
F_{X}(x)=P(X< x)\\
F_{Y}(y)=P(Y< y)
$$

We note that the event $\{X< x \text{ and } Y < y\}$ is the intersection of the events $\{X< x\}$ and $\{Y < y\}$. Since this intersection is a subset of each of those events, its probability is less than or equal to the probability of either one.
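In symbols, the whole argument is the one-line chain:
$$
\{X< x \text{ and } Y< y\} = \{X< x\}\cap\{Y< y\} \subseteq \{X< x\}
\;\Rightarrow\;
F_{X,Y}(x,y) = P(\{X< x\}\cap\{Y< y\}) \leq P(X< x) = F_X(x),
$$
and symmetrically $F_{X,Y}(x,y) \leq F_Y(y)$.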