Characteristic function of joint distribution

In summary: the poster asks for the characteristic function of the joint distribution of two dependent random variables, X normal with mean 0 and variance n and Y chi-squared with n degrees of freedom. Because the variables are not independent, the joint characteristic function is not simply the product of the marginal characteristic functions; in general it is the integral of e^{i(sx+ty)} against the joint distribution. Once the poster clarifies that X is the sum and Y the sum of squares of n iid standard normal variables, independence of those underlying variables reduces the joint characteristic function to the n-th power of a single expectation, which can be evaluated by completing the square in a Gaussian integral.
  • #1
shoplifter
What exactly is a "joint characteristic function"? I want the characteristic function of the joint distribution of two (non-independent) probability distributions. I'll state the problem below for clarity. So my two distributions are the normal distribution with mean 0 and variance n, and the chi squared distribution with n degrees of freedom. I know their individual characteristic functions, but how do I proceed?
 
  • #2
shoplifter said:
What exactly is a "joint characteristic function"? I want the characteristic function of the joint distribution of two (non-independent) probability distributions. I'll state the problem below for clarity. So my two distributions are the normal distribution with mean 0 and variance n, and the chi squared distribution with n degrees of freedom. I know their individual characteristic functions, but how do I proceed?


The characteristic function is the Fourier transform of the PDF (it can also be defined directly as an expectation when no PDF exists). The c.f. of a sum of independent random variables is the product of their individual c.f.s.
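For instance, if X and Y are independent, then

[tex]
\phi_{X+Y}(t) = \mathbb{E}\,e^{it(X+Y)} = \mathbb{E}\,e^{itX}\,\mathbb{E}\,e^{itY} = \phi_X(t)\,\phi_Y(t)
[/tex]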

You could just multiply the relevant c.f.s, but I'm not sure this is correct in your case. For two non-independent Gaussian distributions the product formula includes the correlation coefficient. Since the [tex]\chi^{2}[/tex] approaches the normal as the degrees of freedom grow, I would recommend adding the smaller sample to the larger if possible and treating them as a single univariate Gaussian population. If you can't do this, I would question why you want to evaluate a bivariate distribution where the samples are apparently not compatible.
 
  • #3
The joint characteristic function is

[tex]
\phi_{X,Y}(s,t) = \iint e^{i(sx + ty)} \,dF(x,y) = \iint e^{i(sx + ty)} f(x,y) \, dx dy
[/tex]

(the latter only if the joint distribution is continuous so that there is a density). If the variables are independent, the joint c.f. is the product of the marginal c.f.s; that isn't your case.

You state that X is normal with mean 0 and variance n, and Y is chi-square with n degrees of freedom. If that means that the conditional distribution of X given Y is normal with [tex] \mu = 0, \sigma^2 = n [/tex], you can do this.

As noted, the joint c.f. is

[tex]
\phi_{X,Y}(s,t) = \iint e^{i(sx + ty)} \,dF(x,y) = \iint e^{i(sx + ty)} f(x,y) \, dx dy
[/tex]

In your case the joint density isn't the product of the marginals, but you can write

[tex]
f(x,y) = f(x|Y=y) \cdot g(y)
[/tex]

where [tex] f(x|Y=y) [/tex] is a normal density with mean 0 and variance n, and [tex] g(y) [/tex] is the density of the chi-square distribution with n degrees of freedom. Then

[tex]
\phi_{X,Y} = \iint e^{i(sx+ty)} \,f(x,y) dx dy = \int\left(\int e^{isx} f(x|Y=y) \,dx\right) e^{ity} g(y) \, dy
[/tex]

The expression in the inner integral is simply the c.f. for the normal distribution (mean = 0, variance = n), so you can evaluate that immediately. What's left is to take the integral of that with respect to the chi-square density.
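Under that reading, the inner integral is [tex]e^{-ns^2/2}[/tex] (the c.f. of a normal with mean 0 and variance n), which does not depend on y, so using the known chi-square c.f. the remaining integral gives

[tex]
\phi_{X,Y}(s,t) = e^{-ns^2/2} \int e^{ity} g(y)\, dy = e^{-ns^2/2}\,(1-2it)^{-n/2}
[/tex]

(a product form; under this particular reading X and Y would actually be independent, since the conditional density of X doesn't vary with y).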
 
  • #4
statdad said:
[tex]
\phi_{X,Y} = \iint e^{i(sx+ty)} \,f(x,y) dx dy = \int\left(\int e^{isx} f(x|Y=y) \,dx\right) e^{ity} g(y) \, dy
[/tex]

The expression in the inner integral is simply the c.f. for the normal distribution (mean = 0, variance = n), so you can evaluate that immediately. What's left is to take the integral of that with respect to the chi-square density.

The reason I didn't suggest something like this was that we don't know the correlation; [tex]\rho[/tex] is generally assumed to be valid for normally distributed data. We don't even know whether the chi-square is central or non-central. How would you handle this?
 
  • #5
Basically I gambled. I took the OP's post as giving all relevant information - that one distribution was normal, [itex] \mu = 0, \sigma^2 = n [/itex], the other [itex] \chi^2 [/itex] with [itex] n [/itex] degrees of freedom.
In a sense, since we don't have two normal distributions, the shared [itex] n [/itex] is what links the two marginals - it plays the role the correlation coefficient would otherwise play.
Also, if the chi-square distribution is non-central, the only thing that changes is that the second of the two integrations becomes more difficult.

I must admit one more thing in addition to my gamble: I guessed (I don't think gambling and guessing are the same thing here). It is rather common for questions in a similar vein to be posed when both distributions are discrete, with the only link of dependence being the item specified. I guessed the same case would hold here.

So there you have it. If the details in the first post were complete, I'm okay. If they weren't, what's missing will be supplied.

If you have a different take I'd be interested. I hope I haven't overstepped bounds by doing this.
 
  • #6
statdad said:
In a sense, since we don't have two normal distributions, the shared [itex] n [/itex] is what links the two marginals - it plays the role the correlation coefficient would otherwise play.
Also, if the chi-square distribution is non-central, the only thing that changes is that the second of the two integrations becomes more difficult.

So there you have it. If the details in the first post were complete, I'm okay. If they weren't, what's missing will be supplied.

If you have a different take I'd be interested. I hope I haven't overstepped bounds by doing this.

No problem. I was just thinking of this as a problem in applied statistics. The chi-square would apply to a small sample. It just seemed odd to try to define a bivariate distribution in these terms, especially when the variables are termed "non-independent". An explicit expression for the joint characteristic function of two non-independent (jointly) Gaussian variables is:

[tex]\phi(t_{1},t_{2})=\exp\!\left[i(t_{1}\mu_{1}+t_{2}\mu_{2})-\tfrac{1}{2}\left(\sigma_{1}^{2}t_{1}^{2}+2\rho \sigma_{1} \sigma_{2} t_{1}t_{2}+\sigma_{2}^{2} t_{2}^{2}\right)\right][/tex]
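As a check, when [tex]\rho = 0[/tex] this factors as

[tex]
\phi(t_1,t_2) = \exp\!\left[it_1\mu_1 - \tfrac{1}{2}\sigma_1^2 t_1^2\right]\exp\!\left[it_2\mu_2 - \tfrac{1}{2}\sigma_2^2 t_2^2\right]
[/tex]

the product of the two marginal normal c.f.s, i.e. the independent case.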

I also didn't know what the OP meant by a variance of n. Does that convey something about the relationship with the chi square?
 
  • #7
The variance of the normal distribution equals the number of degrees of freedom of the chi square distribution.
 
  • #8
statdad said:
The variance of the normal distribution equals the number of degrees of freedom of the chi square distribution.

OK, I understand that, but I was under the impression the OP was talking about two distinct distributions. Are we to assume that the degrees of freedom k = n in this case?

EDIT: Whoops. I see it. The OP defined k=n.
 
  • #9
Thank you for your detailed responses. However, the original question I am trying to solve does not say "X is normal given that Y is chi-squared". It says something like: here are n independent, identically distributed standard normal variables; let X be their sum and Y the sum of their squares (which is why I stated that X is N(0, n) and Y is chi-squared with n degrees of freedom). Then find the characteristic function of the joint distribution of X and Y. That changes things quite a bit, right?
 
  • #10
shoplifter said:
Thank you for your detailed responses. However, the original question I am trying to solve does not say "X is normal given that Y is chi-squared". It says something like: here are n independent, identically distributed standard normal variables; let X be their sum and Y the sum of their squares (which is why I stated that X is N(0, n) and Y is chi-squared with n degrees of freedom). Then find the characteristic function of the joint distribution of X and Y. That changes things quite a bit, right?

That really isn't what you wrote in your original question. Instead of "something like..." can you post the exact wording?
 
  • #11
yes, I apologize. Suppose A_1, ..., A_n are iid standard normal variables, and say X = A_1 + ... + A_n, and Y = A_1^2 + ... + A_n^2. Then what's the char. func. of the joint probability distribution of X and Y?

Apologies again for not being clear before.
 
  • #12
shoplifter said:
yes, I apologize. Suppose A_1, ..., A_n are iid standard normal variables, and say X = A_1 + ... + A_n, and Y = A_1^2 + ... + A_n^2. Then what's the char. func. of the joint probability distribution of X and Y?

Apologies again for not being clear before.

Here it's easier to use the definition

[tex]\phi_{X,Y}(s,t) = \mathbb{E}[e^{isX+itY}][/tex]

which, since the exponent is [tex]isX+itY = \sum_{j=1}^n \left(isA_j + itA_j^2\right)[/tex] and the [itex]A_j[/itex] are independent and identically distributed, reduces to

[tex]\mathbb{E}[e^{isA_1+itA_1^2}]^n[/tex]

The latter expectation, expressed as an integral against the standard normal density, can be solved by completing the square.
 
  • #13
so I get the characteristic function to be [tex]\mathbb{E}\!\left[e^{-is^2/4t + i\left(A_1\sqrt{t} + \frac{s}{2\sqrt{t}}\right)^2}\right]^n[/tex]. I'm guessing we can take the first (constant) factor out of the expectation, as [tex]\mathbb{E}(cZ) = c\,\mathbb{E}(Z)[/tex]. But I don't see an immediate way to calculate the remaining expectation, because the integral is too unwieldy. Any help would be much appreciated.

As a second small question, what exactly does this quantity measure?
 
  • #14
For the answer (which is [tex]\mathbb{E}[e^{isX+itY}][/tex]), I am getting the following quantity raised to the power n:

[tex]\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx[/tex]

Is this correct? Thanks.
 
  • #15
Sorry, to restate the previous post: I found the value of [tex]\mathbb{E}[e^{isX+itY}][/tex] to be

[tex]\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx.[/tex]
 
  • #16
shoplifter said:
Sorry, to restate the previous post: I found the value of [tex]\mathbb{E}[e^{isX+itY}][/tex] to be

[tex]\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{isx + itx^2}e^{-x^2/2}\,dx.[/tex]

Instead of X, Y I think you need [tex]A_1, A_1^2[/tex] inside the expectation: the integral you wrote equals [tex]\mathbb{E}[e^{isA_{1}+itA_{1}^2}][/tex], and following bpet's suggestion the joint c.f. [tex]\mathbb{E}[e^{isX+itY}][/tex] is that quantity raised to the power n.
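For completeness, bpet's completing-the-square computation can be carried out explicitly. Using the standard Gaussian integral [tex]\int_{-\infty}^{\infty} e^{-ax^2+bx}\,dx = \sqrt{\pi/a}\;e^{b^2/4a}[/tex] (valid for [itex]\mathrm{Re}\,a > 0[/itex]; here [itex]a = \tfrac{1}{2} - it[/itex] and [itex]b = is[/itex]):

[tex]
\mathbb{E}\left[e^{isA_1+itA_1^2}\right] = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{-\left(\frac{1}{2}-it\right)x^2 + isx}\,dx = (1-2it)^{-1/2}\exp\!\left(-\frac{s^2}{2(1-2it)}\right)
[/tex]

so the joint characteristic function is

[tex]
\phi_{X,Y}(s,t) = (1-2it)^{-n/2}\exp\!\left(-\frac{ns^2}{2(1-2it)}\right)
[/tex]

As a sanity check, [itex]t = 0[/itex] gives [itex]e^{-ns^2/2}[/itex], the c.f. of [itex]N(0,n)[/itex], and [itex]s = 0[/itex] gives [itex](1-2it)^{-n/2}[/itex], the c.f. of [itex]\chi^2_n[/itex].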
 

Related to Characteristic function of joint distribution

What is the characteristic function of a joint distribution?

The characteristic function of a joint distribution is a mathematical function that characterizes the probability distribution of two or more random variables. It is the natural generalization of the characteristic function of a single random variable.
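In symbols, for two random variables X and Y:

[tex]\phi_{X,Y}(s,t) = \mathbb{E}\left[e^{isX+itY}\right][/tex]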

How is the characteristic function of a joint distribution different from a probability density function?

A characteristic function is generally complex-valued, while a probability density function is real-valued. Moreover, the characteristic function always exists and uniquely determines the distribution, whereas a density need not exist.

What information can be obtained from the characteristic function of a joint distribution?

A characteristic function of joint distribution can provide information about the moments of the random variables, such as the mean and variance. It can also be used to calculate probabilities and identify the type of distribution.
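For example, when the moments exist they can be read off by differentiating at the origin:

[tex]
\mathbb{E}[X^m Y^k] = i^{-(m+k)}\,\frac{\partial^{m+k}}{\partial s^m\,\partial t^k}\,\phi_{X,Y}(s,t)\Big|_{s=t=0}
[/tex]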

What is the relationship between the characteristic function of a joint distribution and the moment generating function?

When the moment generating function exists, the two are related by a change of argument: [tex]M_{X,Y}(s,t) = \phi_{X,Y}(-is,-it)[/tex], i.e. the characteristic function is the moment generating function evaluated at purely imaginary arguments. Unlike the moment generating function, the characteristic function exists for every distribution.

How are characteristic functions of joint distributions used in statistics?

Characteristic functions of joint distributions are used in statistics to analyze the relationship between multiple random variables and to make predictions about their behavior. They are also used to derive other statistical quantities, such as covariances and correlation coefficients.
