If X and Y are independent, are X^k and Y?

  • Thread starter AxiomOfChoice
  • Start date
  • Tags
    Independent
In summary, the thread discusses independence of random variables from a measure-theoretic standpoint. If X and Y are independent, then f(X) and g(Y) are also independent for any Borel-measurable functions f and g, so in particular E[f(X)g(Y)] = E[f(X)]E[g(Y)]. The proof involves Dynkin's pi-lambda theorem, and a textbook reference is given for the details.
  • #1
AxiomOfChoice
The definition of independence of random variables from a measure-theoretic standpoint is so confusing (independence of generated sigma-algebras, etc.) that I cannot answer this seemingly simple question... So if [itex]X,Y[/itex] are independent random variables, does that mean each of [itex]X,X^2,X^3,X^4,\ldots[/itex] is independent of each of [itex]Y,Y^2,Y^3,\ldots[/itex]?
 
  • #2
Okay...just looked this up in a book. In fact, if [itex]X,Y[/itex] are independent, then we have [itex]E[f(X)g(Y)] = E[f(X)]E[g(Y)][/itex] for "any" functions f and g. But the proof is from a non-measure-theoretic probability book. Is there anyone who can explain why this holds from a measure-theoretic standpoint?
 
  • #3
When X and Y have densities, expectations can be expressed as integrals against those densities. Since X and Y are independent, their joint density is simply the product of their individual densities, so the double integral for E[f(X)g(Y)] splits into the product of two single integrals, giving E[f(X)]E[g(Y)].
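As a numerical sanity check of the factorization above (my own sketch, not from the thread; the distributions and functions f, g are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(size=n)        # X ~ N(0, 1)
y = rng.exponential(size=n)   # Y ~ Exp(1), drawn independently of X

f = lambda t: t ** 2          # f(X) = X^2
g = np.cos                    # g(Y) = cos(Y)

lhs = np.mean(f(x) * g(y))            # Monte Carlo estimate of E[f(X) g(Y)]
rhs = np.mean(f(x)) * np.mean(g(y))   # Monte Carlo estimate of E[f(X)] E[g(Y)]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

With these choices E[X^2] = 1 and E[cos Y] = 1/2 for Exp(1), so both estimates should hover near 0.5.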
 
  • #4
AxiomOfChoice said:
Okay...just looked this up in a book. In fact, if [itex]X,Y[/itex] are independent, then we have [itex]E[f(X)g(Y)] = E[f(X)]E[g(Y)][/itex] for "any" functions f and g.

I don't know what that result has to do with your original question. The conclusion deals with the expectations of f(X) and g(Y), not with their independence. Random variables can be uncorrelated and still be dependent.

I'm not an expert on measure theory, but I did take the course years ago. I think answering your original post (which concerns functions of a random variable) in detail is complicated. For example, not all functions are measurable. The ones you listed are. How do we prove they are? Do you want an explanation that begins at elementary points like that, or do you simply want a theorem from measure theory that answers your question as a special case?
 
  • #5
It's probably easier to use P[X<=x,Y<=y] = P[X<=x]P[Y<=y] (which can be obtained from the measure-theoretic definition of independence by considering generators of the Borel sigma algebras for R and R^2).
 
  • #6
Let X and Y be independent random variables and let f and g be Borel-measurable functions on R. A measure-theoretic proof that f(X) and g(Y) are independent random variables is given in Shreve's book "Stochastic Calculus for Finance II"; see Theorem 2.2.5.
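For reference, the core of the argument Shreve formalizes can be sketched in a few lines (my paraphrase, not a quote from the book):

```latex
Since $f$ is Borel measurable, every event generated by $f(X)$ is already
an event generated by $X$:
\[
  \{f(X) \in B\} \;=\; \{X \in f^{-1}(B)\},
  \qquad f^{-1}(B) \in \mathcal{B}(\mathbb{R}),
\]
so $\sigma(f(X)) \subseteq \sigma(X)$ and likewise
$\sigma(g(Y)) \subseteq \sigma(Y)$. Independence of $\sigma(X)$ and
$\sigma(Y)$ therefore passes down to the sub-sigma-algebras
$\sigma(f(X))$ and $\sigma(g(Y))$, which is exactly the statement that
$f(X)$ and $g(Y)$ are independent.
```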
 

Related to If X and Y are independent, are X^k and Y?

1. What does it mean for two variables to be "independent"?

Two random variables are independent if knowing the value of one tells you nothing about the value of the other. Formally, P(X in A, Y in B) = P(X in A)P(Y in B) for all Borel sets A and B.

2. How can we determine if two variables are independent?

To verify independence, we check that the joint distribution factors: P(X &lt;= x, Y &lt;= y) = P(X &lt;= x)P(Y &lt;= y) for all x and y. Note that zero covariance is not enough: independence implies zero covariance (when the moments exist), but uncorrelated variables can still be dependent.

3. What is the significance of X^k in relation to independence?

X^k, or X raised to the power of k, is a Borel-measurable function of X, so every event involving X^k is an event involving X. Since independence of X and Y means the sigma-algebras they generate are independent, X^k and Y are independent as well.

4. Can X^k and Y be independent if X and Y are not?

Yes, it is possible for X^k and Y to be independent even if X and Y are not, because raising to a power can destroy exactly the information that made them dependent. For example, if X is standard normal and Y = sign(X), then Y is completely determined by X, yet X^2 depends only on |X| and is independent of Y.
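The sign example can be checked numerically (a sketch of my own, not from the thread): if X^2 and Y = sign(X) are independent, conditioning on the sign should not change probabilities involving X^2.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)  # X ~ N(0, 1), symmetric about 0
y = np.sign(x)                  # Y = sign(X): fully determined by X, so X and Y are dependent

# X^2 depends only on |X|, which for a symmetric X carries no information
# about the sign, so conditioning on Y should leave P(X^2 > 1) unchanged.
p = np.mean(x**2 > 1)             # unconditional P(X^2 > 1)
p_pos = np.mean(x[y > 0]**2 > 1)  # P(X^2 > 1 | Y = +1)
p_neg = np.mean(x[y < 0]**2 > 1)  # P(X^2 > 1 | Y = -1)
print(p, p_pos, p_neg)  # all three agree up to Monte Carlo error
```

The exact value is P(X^2 > 1) = 2(1 - Phi(1)), roughly 0.317; all three estimates should land near it.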

5. How does the independence of X^k and Y affect statistical analysis?

If X^k and Y are independent, we can treat them separately in statistical analysis: their joint distribution is the product of the marginals, expectations of products factor (E[X^k Y] = E[X^k]E[Y]), and variances of sums add (Var(X^k + Y) = Var(X^k) + Var(Y)).
