
Two (or more) random variables can be defined over the same sample space. Just as with jointly defined events, the joint distribution function is easily defined.

$$P_{X,Y}(x,y) \equiv \Pr\left[\{X \le x\} \cap \{Y \le y\}\right]$$
The joint probability density function $p_{X,Y}(x,y)$ is related to the distribution function via double integration:
$$P_{X,Y}(x,y) = \int_{-\infty}^{x}\int_{-\infty}^{y} p_{X,Y}(\alpha,\beta)\,d\beta\,d\alpha,$$
or, equivalently,
$$p_{X,Y}(x,y) = \frac{\partial^2 P_{X,Y}(x,y)}{\partial x\,\partial y}.$$
Since $\lim_{y\to\infty} P_{X,Y}(x,y) = P_X(x)$, the so-called marginal density functions can be related to the joint density function:
$$p_X(x) = \int_{-\infty}^{\infty} p_{X,Y}(x,\beta)\,d\beta \qquad\text{and}\qquad p_Y(y) = \int_{-\infty}^{\infty} p_{X,Y}(\alpha,y)\,d\alpha.$$
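
For a concrete check, consider an illustrative joint density (introduced here for illustration, not part of the original text): $p_{X,Y}(x,y) = x + y$ for $0 \le x, y \le 1$ and zero elsewhere. Integrating out one variable gives the marginals:

$$p_X(x) = \int_0^1 (x + y)\,dy = x + \tfrac{1}{2}, \qquad 0 \le x \le 1,$$

and, by symmetry, $p_Y(y) = y + \tfrac{1}{2}$. Each marginal integrates to one, as a density must.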

Extending the ideas of conditional probabilities, the conditional probability density function $p_{X|Y}(x|Y=y)$ is defined (when $p_Y(y) \ne 0$) as

$$p_{X|Y}(x|Y=y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}.$$
Two random variables are statistically independent when $p_{X|Y}(x|Y=y) = p_X(x)$, which is equivalent to the condition that the joint density function is separable: $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$.
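
Continuing the illustrative density $p_{X,Y}(x,y) = x + y$ from above, the conditional density is

$$p_{X|Y}(x|Y=y) = \frac{x+y}{y+\tfrac{1}{2}}, \qquad 0 \le x \le 1,$$

which varies with $y$; equivalently, $x + y \ne \left(x+\tfrac{1}{2}\right)\left(y+\tfrac{1}{2}\right)$, so the joint density does not factor and $X$ and $Y$ are dependent.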

For jointly defined random variables, expected values are defined just as they are for single random variables. Probably the most important joint moment is the covariance:

$$\mathrm{cov}(X,Y) \equiv E[XY] - E[X]\,E[Y],$$
where $E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, p_{X,Y}(x,y)\,dx\,dy$. Related to the covariance is the (confusingly named) correlation coefficient: the covariance normalized by the standard deviations of the component random variables,
$$\rho_{X,Y} = \frac{\mathrm{cov}(X,Y)}{\sigma_X \sigma_Y}.$$
When two random variables are uncorrelated, their covariance and correlation coefficient equal zero, so that $E[XY] = E[X]\,E[Y]$. Statistically independent random variables are always uncorrelated, but uncorrelated random variables can be dependent.
Let $X$ be uniformly distributed over $[-1,1]$ and let $Y = X^2$. The two random variables are uncorrelated (by symmetry, $E[XY] = E[X^3] = 0 = E[X]\,E[Y]$), but they are clearly not independent.
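
A quick Monte Carlo check of this example (a minimal sketch, not part of the original text; the sample size and seed are arbitrary choices) shows the empirical correlation coefficient is near zero even though $Y$ is completely determined by $X$:

```python
import numpy as np

rng = np.random.default_rng(0)             # fixed seed, arbitrary choice
x = rng.uniform(-1.0, 1.0, size=100_000)   # X ~ uniform on [-1, 1]
y = x**2                                   # Y = X^2 is fully determined by X

# Empirical covariance and correlation coefficient.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
rho = cov_xy / (np.std(x) * np.std(y))
print(f"cov(X,Y) ~ {cov_xy:.4f}, rho ~ {rho:.4f}")   # both near zero

# Dependence shows up in the conditional mean: E[Y | |X| > 0.9] != E[Y].
print(f"E[Y] ~ {y.mean():.3f}, E[Y | |X| > 0.9] ~ {y[np.abs(x) > 0.9].mean():.3f}")
```

The last line makes the dependence visible: $E[Y] = \tfrac{1}{3}$, while conditioning on $|X| > 0.9$ pushes the mean of $Y$ above $0.9$.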

A conditional expected value is the mean of the conditional density.

$$E[X|Y] = \int_{-\infty}^{\infty} x\, p_{X|Y}(x|Y=y)\,dx$$
Note that the conditional expected value is now a function of Y and is therefore a random variable. Consequently, it too has an expected value, which is easily evaluated to be the expected value of X .
$$E\big[E[X|Y]\big] = \int_{-\infty}^{\infty}\left[\int_{-\infty}^{\infty} x\, p_{X|Y}(x|Y=y)\,dx\right] p_Y(y)\,dy = E[X]$$
More generally, the expected value of a function of two random variables can be shown to be the expected value of a conditional expected value: $E[f(X,Y)] = E\big[E[f(X,Y)|Y]\big]$. This kind of calculation is frequently simpler to evaluate than trying to find the expected value of $f(X,Y)$ "all at once." A particularly interesting example of this simplicity is the random sum of random variables. Let $L$ be a random variable and $\{X_l\}$ a sequence of random variables. We will find occasion to consider the quantity $S_L = \sum_{l=1}^{L} X_l$. Assuming that each component of the sequence has the same expected value $E[X]$ and that $L$ is statistically independent of the $X_l$, the expected value of the sum is found to be
$$E[S_L] = E\!\left[E\!\left[\left.\sum_{l=1}^{L} X_l\,\right|L\right]\right] = E\big[L\,E[X]\big] = E[L]\,E[X].$$
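
As a sanity check of this identity (an illustrative simulation introduced here, not from the original text; the Poisson and exponential distributions are arbitrary choices), let $L$ be Poisson with mean 4 and the $X_l$ be exponential with mean 2, so the identity predicts $E[S_L] = 4 \cdot 2 = 8$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 100_000
mean_L, mean_X = 4.0, 2.0                  # arbitrary illustrative choices

# For each trial, draw the random count L, then sum that many X_l draws.
L = rng.poisson(mean_L, size=n_trials)
sums = np.array([rng.exponential(mean_X, size=n).sum() for n in L])

print(f"empirical E[S_L] ~ {sums.mean():.3f}")       # near E[L]*E[X] = 8
print(f"predicted E[S_L] = {mean_L * mean_X}")
```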

Source: OpenStax, Statistical signal processing. OpenStax CNX. Dec 05, 2011. Download for free at http://cnx.org/content/col11382/1.1