
Help Needed for Joint Distributions



mlin0823
06-09-2010, 11:43 PM
I'm fuzzy about the idea of joint distributions and would appreciate it if someone can please help me understand the concepts more clearly.

My understanding of a joint distribution is that it is the distribution of two random variables occurring at the same time, or the intersection of two random events, i.e. f(x,y) = Pr(X=x, Y=y)

I am unclear about whether this means that a joint distribution can be thought of as a function of random variables, e.g. Z = X * Y

If so, then is A = X + Y also a joint distribution? What about other functions of random variables?

Multiplying the densities together seems to be the way to get the joint density when the variables are independent. What about when they are not independent?

I think my main problem is that I don't see the link (if there is any) between joint distributions and functions of random variables.

Can someone please help?? Thanks!!!

BGM
06-10-2010, 12:38 AM
The joint probability distribution has dimension higher than 1,
i.e. you are characterizing more than one random variable together,
each with its own marginal behaviour as well as their joint, interacting behaviour.

Random variables like Z = XY or A = X + Y are each still univariate random
variables. The probability distribution of Z (or A) is not the joint
distribution of X and Y.

The codomain of the functions you mentioned is \mathbb{R},
i.e. g:\mathbb{R}^2 \rightarrow \mathbb{R}, g(x, y) = xy
h:\mathbb{R}^2 \rightarrow \mathbb{R}, h(x, y) = x + y
The probability distribution of a function of random variables and the joint
probability distribution of random variables are two different concepts.
Unless the codomain of the function has a higher dimension,
they are not directly related.
Note that the joint distribution of X and Y = cX
has no joint density: the variables are perfectly correlated, so the distribution reduces to one dimension.
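As a quick sketch of that degeneracy (with c = 2 chosen purely for illustration): every sampled pair (X, 2X) lies on a line, so the sample correlation is exactly 1 and the pair never fills out a 2-D region.

```python
# Sketch: with Y = cX (here c = 2) the pair (X, Y) lives on a line,
# so there is no 2-D joint density; the sample correlation is 1.
import random

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(10_000)]
ys = [2 * x for x in xs]    # Y = cX with c = 2

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
corr = cov / (sx * sy)
print(corr)   # 1.0 up to floating point
```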

To find the joint probability mass/density function,
you can multiply the original marginal mass/density functions if they are
mutually independent:
f_{X, Y}(x, y) = f_X(x)f_Y(y)
If not, you need to find the conditional mass/density function first
and then multiply it by the marginal:
f_{X, Y}(x, y) = f_{X|Y=y}(x|y)f_Y(y) = f_{Y|X=x}(y|x)f_X(x)
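A minimal discrete sketch of the dependent case (the numbers are made up for illustration): build the joint pmf as conditional times marginal, then recover the marginal of X by summing over y, and note that the joint does NOT factor as f_X(x)f_Y(y).

```python
# Hypothetical numbers: build f_{X,Y}(x, y) = f_{X|Y=y}(x|y) * f_Y(y)
# for dependent X, Y, then recover the marginal of X by summing over y.

# Marginal pmf of Y
f_Y = {0: 0.4, 1: 0.6}

# Conditional pmf of X given Y = y (chosen so X depends on Y)
f_X_given_Y = {
    0: {0: 0.7, 1: 0.3},   # f_{X|Y=0}
    1: {0: 0.2, 1: 0.8},   # f_{X|Y=1}
}

# Joint pmf: conditional times marginal
f_XY = {(x, y): f_X_given_Y[y][x] * f_Y[y]
        for y in f_Y for x in f_X_given_Y[y]}

# Marginal of X: sum the joint pmf over y
f_X = {}
for (x, y), p in f_XY.items():
    f_X[x] = f_X.get(x, 0.0) + p

print(f_XY)                # joint pmf over the four (x, y) pairs
print(f_X)                 # marginal of X
print(sum(f_XY.values()))  # sums to 1.0
```

Here f_XY[(0, 0)] = 0.28 while f_X[0] * f_Y[0] = 0.16, so the product-of-marginals shortcut would give the wrong joint pmf: it only works under independence.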

Be careful: there is NO probability interpretation for a continuous
probability density function alone. The cumulative distribution function, i.e. the integral of the density, is what has a probability interpretation.
For a continuous random variable X,
Pr\{X = x\} = 0 \quad \forall x \in \mathbb{R}
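To make that concrete, here is a small sketch using a density that exceeds 1 (Uniform(0, 0.5), chosen for illustration): the density value at a point is not a probability, but integrating it over an interval is.

```python
# Sketch: a continuous density can exceed 1 and carries no probability
# by itself; only its integral does.  Uniform(0, 0.5) has density 2.

def pdf(x):
    """Density of Uniform(0, 0.5)."""
    return 2.0 if 0.0 <= x <= 0.5 else 0.0

# pdf(0.3) = 2.0 -- clearly not a probability.
# Pr{X = 0.3} = 0: the integral over a single point vanishes.
# Pr{0.2 <= X <= 0.3} comes from integrating the density (midpoint sum):
n = 100_000
a, b = 0.2, 0.3
prob = sum(pdf(a + (i + 0.5) * (b - a) / n) for i in range(n)) * (b - a) / n
print(prob)   # ~ 0.2
```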

mlin0823
06-10-2010, 04:37 AM
I thought I had this thing figured out until I stumbled upon an exercise where I had to find E[X*Y]....

The solution used a double integral of xy times the joint probability density over all values to find this expected value.

This confuses me since we are finding an expected value for a function of random variables using their joint density; which I thought were different concepts.

Also, if I were to find E[X/Y], do I integrate x/y times the joint probability density over all values?

Can someone please help!! Thanks!!

BGM
06-10-2010, 11:02 AM
E[XY] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} xy f_{X,Y}(x, y)\,dx\,dy

If you find the distribution of Z = XY first, then equivalently
E[XY] = E[Z] = \int_{-\infty}^{+\infty} z f_Z(z)\,dz

For other functions, the formula is similar. This is the "law of the unconscious statistician": you can average g(X, Y) against the joint density directly, without ever working out the distribution of g(X, Y).
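A numerical sketch of the two routes, using independent X, Y ~ Uniform(0, 1) (a standard example where Z = XY has the known density f_Z(z) = -ln z on (0, 1]): both integrals should come out to E[X]E[Y] = 1/4. Plain midpoint Riemann sums, no external libraries.

```python
# (a) E[XY] as the double integral of x*y*f_{X,Y}(x,y), with
#     f_{X,Y} = 1 on the unit square (independent uniforms).
# (b) E[XY] as E[Z] = integral of z*f_Z(z), with f_Z(z) = -ln(z).
import math

n = 400
h = 1.0 / n

# (a) E[XY] = integral of x*y*1 dx dy over [0,1]^2 (midpoint rule)
e_joint = sum((i + 0.5) * h * (j + 0.5) * h
              for i in range(n) for j in range(n)) * h * h

# (b) E[Z] = integral of z * (-ln z) dz over (0,1] (midpoint rule)
e_z = sum((k + 0.5) * h * (-math.log((k + 0.5) * h)) for k in range(n)) * h

print(e_joint, e_z)   # both ~ 0.25
```

The two answers agree, which is exactly the point of the thread: the distribution of the single random variable Z = XY and the joint distribution of (X, Y) are different objects, but either one lets you compute E[XY].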