Independent random variables with common expectation and variance

In summary, the question asks for the variance of the sum of n independent random variables with common expectation μ and variance σ^2. Using the formulas for expected value and variance, together with the fact that independent random variables have zero covariance, one can show that the variance of the sum Sn = X1 + X2 + · · · + Xn equals the number of variables times the common variance: Var[Sn] = nσ^2.
  • #1
HotMintea
Homework Statement

Suppose X1 , X2 , . . . , Xn are independent random variables, with common expectation μ and variance σ^2 . Let Sn = X1 + X2 + · · · + Xn . Find the variance of Sn.

The attempt at a solution

Expected value:

[itex] E[S_n] = n E[X_i] = n\mu [/itex] (1)

Variance:

[itex] Var[S_n] = E[S_n^2] - E[S_n]^2 = E[S_n^2] - n^2 \mu^2 [/itex] (2) # Substituted (1).

[itex] \displaystyle E[S_n^2] = E\left[\sum_{i=1}^n X_i^2\right] + 2 E\left[\sum_{j=1}^n\sum_{k\ >\ j}^n X_jX_k\right] = n E[X_i^2] + n(n - 1) E[X_jX_k] [/itex] (3) # Expanded S_n^2.

[itex] Var[ X_i ] = E[X_i^2] - E[X_i]^2 = \sigma^2\ \rightarrow\ E[X_i^2] = \sigma^2 + \mu^2 [/itex] (4)

[itex] \displaystyle E[S_n] = n(\sigma^2+\mu^2 + (n - 1) E[X_jX_k]) [/itex] (5) # Substituted (4) into (3).

I'm stuck here.

If I knew the covariance of Xj and Xk, then I could use the following formula:

[itex] Covar[X_j, X_k] = E[X_j X_k] - E[X_j]E[X_k][/itex]

[itex] \rightarrow\ E[X_j X_k] = Covar[X_j, X_k] + E[X_j] E[X_k] = Covar[X_j, X_k] + \mu^2 [/itex] (6)
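
As a quick numerical sanity check of where this is heading (a simulation sketch, not a proof — the sample size and distributions are arbitrary choices), independent samples should show an empirical covariance near zero:

```python
import random

# For independent X and Y, Cov[X, Y] = E[XY] - E[X]E[Y] should be
# (approximately) zero. Empirical check with 100,000 independent
# standard-normal draws for each variable.
random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]

e_x = sum(xs) / n
e_y = sum(ys) / n
e_xy = sum(x * y for x, y in zip(xs, ys)) / n

cov = e_xy - e_x * e_y
print(cov)  # near zero, up to sampling noise
```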

I suspect that "independent random variables with common expectation and variance" implies a certain relation that is necessary for this question.

Can someone give me a hint please?
 
  • #2
obafgkmrns
Do independent random variables have any covariance?
 
  • #3
HotMintea said:
[itex] \displaystyle E[S_n] = n(\sigma^2+\mu^2 + (n - 1) E[X_jX_k]) \hspace{7 cm} [/itex] (5) # Substituted (4) into (3).

I meant [itex] E[S_n^2] = [/itex].

obafgkmrns said:
Do independent random variables have any covariance?

I found a proof of [itex] E[X]E[Y] = E[XY] [/itex] for independent random variables X and Y, which basically uses [itex] P(X,Y) = P(X)P(Y) [/itex]: http://webpages.dcu.ie/~applebyj/ms207/RV2.pdf

So [itex] E[X_j X_k] = \mu^2 [/itex], which turns (5) into [itex] E[S_n^2] = n\sigma^2 + n^2\mu^2 [/itex], and (2) then gives [itex] Var[S_n] = n\sigma^2 [/itex].
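
A Monte Carlo sanity check of that result (a sketch with arbitrary choices of mu, sigma, n, and trial count — not part of the derivation):

```python
import random

# Estimate Var[S_n] for S_n = X_1 + ... + X_n with the X_i drawn
# i.i.d. from N(mu, sigma^2); the result Var[S_n] = n * sigma^2
# should appear up to sampling noise.
random.seed(1)
mu, sigma, n, trials = 2.0, 3.0, 10, 200_000

sums = []
for _ in range(trials):
    s = sum(random.gauss(mu, sigma) for _ in range(n))
    sums.append(s)

mean_s = sum(sums) / trials
var_s = sum((s - mean_s) ** 2 for s in sums) / trials

print(mean_s)  # close to n * mu = 20
print(var_s)   # close to n * sigma^2 = 90
```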

Thanks for the help :smile:
 

Related to Independent random variables with common expectation and variance

What are independent random variables with common expectation and variance?

Independent random variables with common expectation and variance are two or more random variables that do not influence one another and that share the same mean μ and the same variance σ^2.

What does it mean for random variables to be independent?

When random variables are independent, the outcome of one variable carries no information about the outcome of another. Formally, their joint distribution factors into the product of their individual distributions.

How is the expectation of independent random variables calculated?

The expectation of a sum of random variables is the sum of the individual expectations. This follows from linearity of expectation, which in fact holds whether or not the variables are independent.

What is the variance of independent random variables?

The variance of a sum of independent random variables is the sum of the individual variances. Unlike expectation, variance is not linear in general: the variance of a sum also contains covariance terms, and these vanish precisely because the variables are independent.
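
A small exact illustration of both facts, using two independent fair six-sided dice as the example (my choice of example, not from the thread). Because the full joint distribution is enumerated, the equalities hold exactly rather than approximately:

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice X and Y: check E[X+Y] = E[X] + E[Y]
# and Var[X+Y] = Var[X] + Var[Y] by exact enumeration of the
# 36 equally likely outcomes.
faces = range(1, 7)
p = Fraction(1, 36)  # probability of each (x, y) pair

e_sum = sum(p * (x + y) for x, y in product(faces, faces))
e_sum_sq = sum(p * (x + y) ** 2 for x, y in product(faces, faces))
var_sum = e_sum_sq - e_sum ** 2

e_x = Fraction(sum(faces), 6)                              # 7/2
var_x = Fraction(sum(f * f for f in faces), 6) - e_x ** 2  # 35/12

print(e_sum == 2 * e_x)      # True: expectations add
print(var_sum == 2 * var_x)  # True: variances add (independence)
```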

How are independent random variables useful in statistics and probability?

Independent random variables are useful in statistics and probability because they allow us to model and analyze complex systems by breaking them down into simpler, independent components. This makes calculations and predictions easier and more accurate. Additionally, many statistical methods and tests rely on the assumption of independence between variables.
