
Questions or comments concerning this laboratory should be directed to Prof. Charles A. Bouman, School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907; (765) 494-0340; bouman@ecn.purdue.edu.

Bivariate distributions

In this section, we will study the concept of a bivariate distribution. We will see that bivariate distributions characterize how two random variables are related to each other. We will also see that correlation and covariance are two simple measures of the dependencies between random variables, which can be very useful for analyzing both random variables and random processes.

Background on bivariate distributions

Sometimes we need to account for not just one random variable, but several. In this section, we will examine the case of two random variables, the so-called bivariate case, but the theory is easily generalized to accommodate more than two.

The random variables $X$ and $Y$ have cumulative distribution functions (CDFs) $F_X(x)$ and $F_Y(y)$, also known as marginal CDFs. Since there may be an interaction between $X$ and $Y$, the marginal statistics may not fully describe their behavior. Therefore we define a bivariate, or joint, CDF as

$$F_{X,Y}(x,y) = P(X \leq x,\, Y \leq y).$$

If the joint CDF is sufficiently “smooth”, we can define a joint probability density function,

$$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x,y).$$

Conversely, the joint probability density function may be used to calculate the joint CDF:

$$F_{X,Y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(s,t) \, ds \, dt.$$
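As a quick numerical check of this relationship, the sketch below integrates the joint PDF of two independent standard Gaussian random variables and recovers the joint CDF at the origin. This is a minimal Python/scipy illustration of ours, not part of the original lab; the function name f_xy and the evaluation point are chosen only for demonstration.

```python
import numpy as np
from scipy.integrate import dblquad

def f_xy(s, t):
    # Joint PDF of two independent N(0, 1) random variables:
    # the product of the two marginal Gaussian densities.
    return np.exp(-(s**2 + t**2) / 2.0) / (2.0 * np.pi)

x, y = 0.0, 0.0
# F(x, y): integrate t over (-inf, y] (outer) and s over (-inf, x] (inner).
F, _ = dblquad(f_xy, -np.inf, y, -np.inf, x)
print(F)
```

Since $P(X \leq 0) = P(Y \leq 0) = 1/2$ and the two variables are independent, the printed value should be very close to $0.25$.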

The random variables $X$ and $Y$ are said to be independent if and only if their joint CDF (or PDF) is a separable function, which means

$$f_{X,Y}(x,y) = f_X(x) \, f_Y(y).$$

Informally, independence between random variables means that one random variable does not tell you anything about the other. As a consequence of the definition, if $X$ and $Y$ are independent, then the expectation of their product is the product of their expectations:

$$E[XY] = E[X] \, E[Y].$$
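As an empirical sanity check of this property, the following sketch (again a Python/numpy illustration of ours, not part of the original lab) estimates both sides of the equation from a large number of independent Gaussian samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.standard_normal(n)  # X ~ N(0, 1)
Y = rng.standard_normal(n)  # Y ~ N(0, 1), drawn independently of X

print(np.mean(X * Y))            # estimate of E[XY]
print(np.mean(X) * np.mean(Y))   # estimate of E[X] E[Y]
# Both should be near 0, up to Monte Carlo error on the order of 1/sqrt(n).
```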

While the joint distribution contains all the information about $X$ and $Y$, it can be very complex and is often difficult to calculate. In many applications, a simple measure of the dependencies of $X$ and $Y$ can be very useful. Three such measures are the correlation, the covariance, and the correlation coefficient.

  • Correlation
    $$E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy \, f_{X,Y}(x,y) \, dx \, dy$$
  • Covariance
    $$E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) \, f_{X,Y}(x,y) \, dx \, dy$$
  • Correlation coefficient
    $$\rho_{XY} = \frac{E[(X - \mu_X)(Y - \mu_Y)]}{\sigma_X \sigma_Y} = \frac{E[XY] - \mu_X \mu_Y}{\sigma_X \sigma_Y}$$

If the correlation coefficient is 0, then $X$ and $Y$ are said to be uncorrelated. Notice that independence implies uncorrelatedness; however, the converse is not true.
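The failure of the converse can be seen with the classic example $W = X^2$, where $X \sim N(0,1)$; the name $W$ is introduced here only for illustration and does not appear in the original lab. $W$ is completely determined by $X$, so the two are clearly dependent, yet they are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)
W = X**2  # W is a deterministic function of X, hence dependent on X

# Sample correlation coefficient of (X, W): near 0, because
# Cov(X, W) = E[X^3] - E[X] E[X^2] = 0 by the symmetry of the Gaussian.
print(np.corrcoef(X, W)[0, 1])
```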

Samples of two random variables

In the following experiment, we will examine the relationship between the scatter plots for pairs of random samples $(X_i, Z_i)$ and their correlation coefficient. We will see that the correlation coefficient determines the shape of the scatter plot.

Let $X$ and $Y$ be independent Gaussian random variables, each with mean 0 and variance 1. We will consider the correlation between $X$ and $Z$, where $Z$ is equal to the following:

  1. $Z = Y$
  2. $Z = (X + Y)/2$
  3. $Z = (4X + Y)/5$
  4. $Z = (99X + Y)/100$

Notice that since $Z$ is a linear combination of two Gaussian random variables, $Z$ will also be Gaussian.
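A sketch of this experiment is given below, assuming numpy and matplotlib; the original lab was presumably written for MATLAB, so this Python port is an illustration of ours rather than the lab's own code. It draws $n = 1000$ sample pairs, forms $Z$ for each of the four cases above, and plots the scatter of $(X_i, Z_i)$ together with the sample correlation coefficient.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 1000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)

# The four cases from the list above, with their theoretical correlation
# coefficients computed from rho = Cov(X, Z) / (sigma_X * sigma_Z).
cases = [
    ("Z = Y", Y),                             # rho = 0
    ("Z = (X+Y)/2", (X + Y) / 2),             # rho = 1/sqrt(2)    ~ 0.707
    ("Z = (4X+Y)/5", (4 * X + Y) / 5),        # rho = 4/sqrt(17)   ~ 0.970
    ("Z = (99X+Y)/100", (99 * X + Y) / 100),  # rho = 99/sqrt(9802) ~ 0.99995
]

fig, axes = plt.subplots(1, 4, figsize=(16, 4), sharex=True)
for ax, (label, Z) in zip(axes, cases):
    rho = np.corrcoef(X, Z)[0, 1]  # sample correlation coefficient
    ax.plot(X, Z, ".", markersize=3)
    ax.set_title(f"{label}, rho = {rho:.3f}")
    ax.set_xlabel("X")
axes[0].set_ylabel("Z")
plt.tight_layout()
plt.show()
```

As the weight on $X$ grows from case 1 to case 4, the points cluster more and more tightly around a straight line and the sample correlation coefficient approaches 1.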

Source: OpenStax, Purdue digital signal processing labs (ece 438). OpenStax CNX, Sep 14, 2009. Download for free at http://cnx.org/content/col10593/1.4