Proving conditional expectation

In summary, the thread discusses two random variables U and X with E(U|X) = E(U) = 0, and the claim that this implies E(U^2|X) = E(U^2). A counterexample shows the claim does not hold in general; the thread then turns to the homoskedasticity assumption for simple linear regression, which is what actually delivers the equality.
  • #1
Usagi
Hi guys, assume we have two random variables U and X such that [tex]E(U|X) = E(U) = 0[/tex]. I was told that this assumption implies [tex]E(U^2|X) = E(U^2)[/tex], but I'm not sure how to prove it. If anyone could show me, that'd be great!
 
  • #2
Usagi said:
Hi guys, assume we have two random variables U and X such that [tex]E(U|X) = E(U) = 0[/tex]. I was told that this assumption implies [tex]E(U^2|X) = E(U^2)[/tex], but I'm not sure how to prove it. If anyone could show me, that'd be great!

Not sure this is true. Suppose \(U|(X=x) \sim N(0,x^2)\), and \(X\) has whatever distribution we like.

Then \(E(U|X=x)=0\) and \( \displaystyle E(U)=\int \int u f_{U|X=x}(u) f_X(x)\;dudx =\int E(U|X=x) f_X(x) \; dx=0\).

Now \(E(U^2|X=x)={\text{Var}}(U|X=x)=x^2\). While \( \displaystyle E(U^2)=\int E(U^2|X=x) f_X(x) \; dx= \int x^2 f_X(x) \; dx\).

Or have I misunderstood something?

CB
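CB's counterexample can be checked numerically. The sketch below uses a hypothetical concrete setup (my own choices for illustration: X uniform on [1, 2], U|X = x normal with variance x^2) to estimate E(U), E(U^2), and the conditional second moment near x = 1:

```python
# Numerical check of the counterexample above: X ~ Uniform(1, 2),
# U | X = x ~ N(0, x^2).  Then E(U|X=x) = 0 for every x, so E(U) = 0,
# but E(U^2|X=x) = x^2 varies with x, while E(U^2) = E(X^2) = 7/3.
import random

random.seed(0)
n = 200_000
xs = [random.uniform(1.0, 2.0) for _ in range(n)]
us = [random.gauss(0.0, x) for x in xs]      # sd of U given X = x is x

mean_u = sum(us) / n                         # estimates E(U) = 0
mean_u2 = sum(u * u for u in us) / n         # estimates E(U^2) = 7/3

# Conditional second moment for x near 1 (should be close to 1, not 7/3)
near1 = [u * u for x, u in zip(xs, us) if x < 1.1]
cond_u2_near1 = sum(near1) / len(near1)

print(round(mean_u, 3), round(mean_u2, 3), round(cond_u2_near1, 3))
```

The mismatch between the estimate near x = 1 and the overall average of u^2 is exactly the failure of E(U^2|X) = E(U^2).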
 
  • #3
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that [tex]\sigma^2 = E(u^2|x) \implies \sigma^2 = E(u^2)[/tex]

Thanks for your help!
 
  • #4
Usagi said:
Hi CB,

Actually the problem arose from the following passage regarding the homoskedasticity assumption for simple linear regression:

http://img444.imageshack.us/img444/6892/asdfsdfc.jpg
I do not understand how they came to the conclusion that [tex]\sigma^2 = E(u^2|x) \implies \sigma^2 = E(u^2)[/tex]

Thanks for your help!

It is the assumed homoskedasticity (that is what the assumption means): [tex]E(u^2|x) = \sigma^2[/tex] says the conditional second moment is the same constant for every value of x, so averaging over x via the law of iterated expectations gives [tex]E(u^2) = E[E(u^2|x)] = \sigma^2[/tex] as well.

CB
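Spelled out, the step the quoted passage leaves implicit is the law of iterated expectations applied under homoskedasticity:

```latex
\begin{align*}
E(u^2) &= E\bigl[E(u^2 \mid x)\bigr]
  && \text{(law of iterated expectations)} \\
       &= E[\sigma^2]
  && \text{(homoskedasticity: } E(u^2 \mid x) = \sigma^2 \text{ for all } x\text{)} \\
       &= \sigma^2
  && (\sigma^2 \text{ is a constant).}
\end{align*}
```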
 
  • #5


The implication does not follow from [tex]E(U|X) = E(U) = 0[/tex] alone; CB's counterexample in post #2 shows that the conditional second moment can vary with x while those conditions still hold.

What makes the equality work in the regression passage is the homoskedasticity assumption itself: [tex]E(u^2|x) = \sigma^2[/tex] for every value of x, i.e. the conditional second moment is a constant that does not depend on x.

Given that, the unconditional second moment follows from the law of iterated expectations:

[tex]E(u^2) = E[E(u^2|x)] = E[\sigma^2] = \sigma^2[/tex]

since the expectation of a constant is the constant itself.

Finally, because [tex]E(u|x) = 0[/tex] also gives [tex]E(u) = 0[/tex], we have [tex]\text{Var}(u|x) = E(u^2|x) = \sigma^2[/tex] and [tex]\text{Var}(u) = E(u^2) = \sigma^2[/tex], so the conditional and unconditional variances coincide. Without the constancy of [tex]E(u^2|x)[/tex], only the averaged identity [tex]E(u^2) = E[E(u^2|x)][/tex] holds, which is exactly what CB's counterexample illustrates.
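For contrast with the counterexample in post #2, here is a sketch of the homoskedastic case, where u is drawn independently of x with a fixed variance (sigma = 2 and a uniform distribution for x are arbitrary illustrative choices):

```python
# Numerical check of the homoskedastic case: if u is drawn with the same
# variance sigma^2 regardless of x, the conditional second moment E(u^2|x)
# matches the unconditional one.
import random

random.seed(1)
sigma, n = 2.0, 200_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]
us = [random.gauss(0.0, sigma) for _ in xs]   # variance of u ignores x

mean_u2 = sum(u * u for u in us) / n          # estimates E(u^2) = sigma^2

# E(u^2 | x in [0, 0.1]) -- should also be near sigma^2 = 4
low = [u * u for x, u in zip(xs, us) if x < 0.1]
cond_u2_low = sum(low) / len(low)

print(round(mean_u2, 2), round(cond_u2_low, 2))
```

Both estimates settle near sigma^2, which is the content of the equality E(u^2|x) = E(u^2) under homoskedasticity.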
 

Related to Proving conditional expectation

1. What is conditional expectation?

Conditional expectation is a concept in probability theory that refers to the expected value of a random variable given the knowledge of another random variable. It represents the average value of the first random variable, taking into account the information provided by the second random variable.

2. How is conditional expectation calculated?

The discrete form is [tex]E(X|Y=y) = \sum_x x \, P(X=x|Y=y)[/tex], where X and Y are random variables and P(X=x|Y=y) is the probability that X takes the value x given that Y takes the value y. The formula follows directly from the definition of conditional probability; for continuous variables the sum is replaced by an integral against the conditional density.
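As a concrete illustration of the formula, the sketch below computes E(X|Y=y) for a small made-up joint pmf (the probabilities are arbitrary example values):

```python
# Conditional expectation for a discrete joint distribution:
# E(X|Y=y) = sum over x of x * P(X=x | Y=y).
from fractions import Fraction as F

joint = {  # P(X=x, Y=y) for a toy two-variable distribution
    (0, 0): F(1, 4), (1, 0): F(1, 4),
    (0, 1): F(1, 8), (1, 1): F(3, 8),
}

def cond_expectation(joint, y):
    # Marginal P(Y=y), then average x against P(X=x|Y=y) = P(x, y) / P(Y=y)
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

print(cond_expectation(joint, 0))  # 1/2
print(cond_expectation(joint, 1))  # 3/4
```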

3. What is the relationship between conditional expectation and conditional probability?

Conditional probability and conditional expectation are closely related concepts. Conditional probability refers to the likelihood of an event occurring given that another event has already occurred, while conditional expectation refers to the average value of a random variable given the knowledge of another random variable.

4. Why is conditional expectation important?

Conditional expectation is important in many areas of mathematics and statistics, including probability theory, machine learning, and econometrics. It allows us to make predictions and draw conclusions based on incomplete information, which is often the case in real-world scenarios.

5. How is conditional expectation used in real-world applications?

Conditional expectation has numerous applications in fields such as finance, economics, and engineering. For example, it can be used to model stock prices, make predictions about future economic trends, and analyze the performance of complex systems.
