
Thread: Independence of Gaussian and non-Gaussian variables

  1. #1

    Independence of Gaussian and non-Gaussian variables




    Hi,

    If X is a Gaussian random variable, and Y is a non-Gaussian random variable, what are the necessary conditions for X and Y to be independent? I'm only interested in the case where the pdf of Y is continuous with well-defined derivatives (but is otherwise unspecified).

    If X and Y were jointly Gaussian, it would be sufficient to establish that they have vanishing covariance; I'm fine with that case.

    If X and Y were both non-Gaussian, my understanding is that in general it would be necessary to establish that the cross-moments factor as E(X^m Y^n) = E(X^m) E(Y^n) for all integers m, n > 0 (where E denotes expectation).

    Does the fact that X is Gaussian simplify the situation even without knowing anything about the properties of Y? Certainly X and Y being uncorrelated is insufficient (consider the case Y = X^2, for instance). I can't think of a case where E(XY) = E(X)E(Y) and E(X^2 Y) = E(X^2)E(Y) yet X and Y are still dependent, but I'm not sure how to go about proving or disproving this.
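
    For concreteness, here is a quick Monte Carlo check of the Y = X^2 example (a rough Python sketch; the sample size and seed are arbitrary):

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000)   # X ~ N(0, 1)
    y = x**2                             # Y = X^2: fully dependent on X

    # First cross-moment condition holds: E(XY) ~ E(X)E(Y)
    print(np.mean(x * y), np.mean(x) * np.mean(y))        # both ~ 0

    # ...but the next one fails: E(X^2 Y) = E(X^4) = 3, while E(X^2)E(Y) = 1
    print(np.mean(x**2 * y), np.mean(x**2) * np.mean(y))  # ~ 3 vs ~ 1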

    I'm an astrophysicist, with little formal statistics training. This question arose in connection with my research, although since it's not related to some specific data I thought I would ask here rather than in the Applied Statistics forum.

    Thanks for any help or suggestions!

  2. #2
    TS Contributor

    Re: Independence of Gaussian and non-Gaussian variables

    The most general definition of independence for two continuous random variables X, Y is that their joint pdf f_{X,Y} satisfies

    f_{X,Y}(x,y) = f_X(x)f_Y(y) \quad \forall x, y \in \mathbb{R}

    where f_X, f_Y are the marginal pdfs of X and Y respectively.
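
    As a rough empirical illustration (not a proof, and with an arbitrarily chosen non-Gaussian Y), one can compare a histogram estimate of the joint pdf with the product of the marginal estimates; a Python sketch:

    Code:
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    x = rng.standard_normal(n)        # X ~ N(0, 1)
    y = rng.uniform(-1, 1, n)         # an independent, non-Gaussian Y

    # Histogram estimates of f_{X,Y}, f_X, f_Y on a common grid
    H, xe, ye = np.histogram2d(x, y, bins=20,
                               range=[[-3, 3], [-1, 1]], density=True)
    fx, _ = np.histogram(x, bins=xe, density=True)
    fy, _ = np.histogram(y, bins=ye, density=True)

    # Close to zero (up to sampling noise) when X and Y are independent
    print(np.max(np.abs(H - np.outer(fx, fy))))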

    Another necessary and sufficient condition, based on expectations, is that

    E[f(X)g(Y)] = E[f(X)]E[g(Y)]

    for all continuous functions f, g (whenever the expectations exist).

    For compactly supported random variables, the Stone-Weierstrass theorem guarantees that these continuous functions can be uniformly approximated by polynomials, so your assertion about cross-moments holds. For random variables with unbounded support, however, it is another story: there the moments need not even determine the distribution uniquely (the lognormal is the classic example).
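
    To see the expectation condition in action, here is a rough Monte Carlo sketch in Python (the test functions f, g and the sample size are arbitrary choices; a handful of test functions can only reveal dependence, never certify independence):

    Code:
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.standard_normal(500_000)   # X ~ N(0, 1)
    y = x**2                           # the dependent example from above

    f = np.cos(x)                      # bounded continuous test functions
    g = np.tanh(y)

    lhs = np.mean(f * g)               # E[f(X) g(Y)]
    rhs = np.mean(f) * np.mean(g)      # E[f(X)] E[g(Y)]
    print(lhs, rhs)                    # clearly unequal here, flagging dependence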

    Another condition is based on the mgfs (when they exist):

    M_{X,Y}(s,t) = M_X(s)M_Y(t) \quad \forall s, t

    which characterizes independence by the uniqueness of the Laplace transform. Not surprisingly, you have the equivalent condition based on the characteristic functions, by the uniqueness of the Fourier transform.
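
    Empirically, the characteristic-function version is convenient because it always exists. A sketch (the evaluation point is arbitrary, and a finite set of points can only reveal dependence, not certify independence):

    Code:
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.standard_normal(500_000)
    y = x**2                                  # dependent example again

    def ecf(s, t):
        """Empirical joint characteristic function E[exp(i(sX + tY))]."""
        return np.mean(np.exp(1j * (s * x + t * y)))

    s, t = 0.7, 1.3                           # arbitrary evaluation point
    gap = abs(ecf(s, t) - ecf(s, 0) * ecf(0, t))
    print(gap)                                # clearly nonzero here (~0.12)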

    Currently I cannot think of any way to relax these conditions if you only specify one of the random variables to be Gaussian, as the other one is still quite arbitrary. One point that may be useful is that a multivariate Gaussian distribution is an elliptical distribution, so if you specified Y to have an elliptical distribution as well, something may come of it. I am not sure.


  3. #3

    Re: Independence of Gaussian and non-Gaussian variables


    Thanks for the reply. Yes, I agree these conditions must still hold; what I was wondering (and should have written more clearly) is whether demonstrating these conditions becomes easier when one variable is known to be Gaussian. For example, looking at the expectation condition you wrote, and assuming compact support for Y, does the fact that X is Gaussian reduce the number of polynomial combinations I have to show satisfy the condition?

    I thought about writing the characteristic function of Y in terms of a Gaussian characteristic function, and then seeing whether I could express the interdependence of X and Y in terms of this new Gaussian and X, but so far I haven't gotten anything useful out of that.
