Questions or comments concerning this laboratory should be directed to Prof. Charles A. Bouman, School of Electrical and Computer Engineering, Purdue University, West Lafayette IN 47907; (765) 494-0340; bouman@ecn.purdue.edu
In this section, we will study the concept of a bivariate distribution. We will see that bivariate distributions characterize how two random variables are related to each other. We will also see that correlation and covariance are two simple measures of the dependencies between random variables, which can be very useful for analyzing both random variables and random processes.
Sometimes we need to account for not just one random variable, but several. In this section, we will examine the case of two random variables, the so-called bivariate case, but the theory is easily generalized to accommodate more than two.
The random variables $X$ and $Y$ have cumulative distribution functions (CDFs) $F_X(x)$ and $F_Y(y)$, also known as marginal CDFs. Since there may be an interaction between $X$ and $Y$, the marginal statistics may not fully describe their behavior. Therefore we define a bivariate, or joint, CDF as
$$F_{X,Y}(x,y) = P\left(X \le x,\, Y \le y\right)\,.$$
If the joint CDF is sufficiently “smooth”, we can define a joint probability density function,
$$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x\,\partial y} F_{X,Y}(x,y)\,.$$
Conversely, the joint probability density function may be used to calculate the joint CDF:
$$F_{X,Y}(x,y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(s,t)\,dt\,ds\,.$$
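As a numerical sanity check of this relationship, the sketch below integrates an example joint density and compares the result to the closed-form joint CDF. The choice of density (independent standard normals, so the joint CDF factors into a product of marginal CDFs) is an assumption made purely for illustration; it is not the only case covered by the formula above.

```python
import numpy as np
from scipy import stats, integrate

# Example joint pdf (an illustrative assumption): X and Y independent
# standard normals, so f(x, y) = phi(x) * phi(y).
def f_xy(y, x):  # dblquad passes the inner integration variable first
    return stats.norm.pdf(x) * stats.norm.pdf(y)

x0, y0 = 0.5, -1.0

# Joint CDF obtained by integrating the joint pdf over (-inf, x0] x (-inf, y0];
# -8 is effectively -infinity for a standard normal density.
F_numeric, _err = integrate.dblquad(f_xy, -8, x0, -8, y0)

# Closed form for this separable case: F(x0, y0) = Phi(x0) * Phi(y0)
F_exact = stats.norm.cdf(x0) * stats.norm.cdf(y0)
```

The two values should agree to within the quadrature tolerance, confirming that integrating the density recovers the CDF.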
The random variables $X$ and $Y$ are said to be independent if and only if their joint CDF (or PDF) is a separable function, which means
$$F_{X,Y}(x,y) = F_X(x)\,F_Y(y)\,,\qquad \text{or equivalently}\qquad f_{X,Y}(x,y) = f_X(x)\,f_Y(y)\,.$$
Informally, independence between random variables means that one random variable does not tell you anything about the other. As a consequence of the definition, if $X$ and $Y$ are independent, then the product of their expectations is the expectation of their product: $E[XY] = E[X]\,E[Y]$.
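This consequence is easy to check empirically. The sketch below draws independent samples from two arbitrarily chosen distributions (a uniform and a normal, an illustrative assumption) and compares the sample estimate of $E[XY]$ with the product of the sample means.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent samples; the particular distributions are chosen arbitrarily
x = rng.uniform(0, 1, n)        # E[X] = 0.5
y = rng.normal(2.0, 1.0, n)     # E[Y] = 2.0

# For independent X and Y, E[XY] = E[X] E[Y]; compare the sample estimates
e_xy = np.mean(x * y)
e_x_e_y = np.mean(x) * np.mean(y)
```

Both estimates should be close to $0.5 \times 2 = 1$, up to sampling error that shrinks as $n$ grows.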
While the joint distribution contains all the information about $X$ and $Y$, it can be very complex and is often difficult to calculate. In many applications, a simple measure of the dependencies between $X$ and $Y$ can be very useful. Three such measures are the correlation, $E[XY]$; the covariance, $\mathrm{Cov}(X,Y) = E\left[(X-\mu_X)(Y-\mu_Y)\right]$; and the correlation coefficient, $\rho_{XY} = \mathrm{Cov}(X,Y)/(\sigma_X \sigma_Y)$.
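The sketch below estimates all three measures from samples of two linearly related variables. The specific model ($Y = 3X + \text{noise}$) is an assumption for illustration, and the hand-computed correlation coefficient is cross-checked against numpy's built-in estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative model: Y depends linearly on X plus independent noise
x = rng.normal(0, 1, n)
y = 3 * x + rng.normal(0, 1, n)

correlation = np.mean(x * y)                              # estimates E[XY]
covariance = np.mean((x - x.mean()) * (y - y.mean()))     # estimates Cov(X, Y)
rho = covariance / (x.std() * y.std())                    # correlation coefficient

# Cross-check against numpy's built-in estimator
rho_np = np.corrcoef(x, y)[0, 1]
```

For this model the true correlation coefficient is $3/\sqrt{10} \approx 0.949$, and both estimates should land near that value.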
If the correlation coefficient is 0, then $X$ and $Y$ are said to be uncorrelated. Notice that independence implies uncorrelatedness; however, the converse is not true.
In the following experiment, we will examine the relationship between the scatter plots for pairs of random samples $({X}_{i},{Z}_{i})$ and their correlation coefficient. We will see that the correlation coefficient determines the shape of the scatter plot.
Let $X$ and $Y$ be independent Gaussian random variables, each with mean 0 and variance 1. We will consider the correlation between $X$ and $Z$ , where $Z$ is equal to the following:
Notice that since $Z$ is a linear combination of two Gaussian random variables, $Z$ will also be Gaussian.
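The lab's exact formula for $Z$ is not reproduced above, so the sketch below uses one standard construction (an assumption on my part): $Z = \rho X + \sqrt{1-\rho^2}\,Y$, which yields a zero-mean, unit-variance Gaussian $Z$ whose correlation coefficient with $X$ is exactly $\rho$. It generates the sample pairs $(X_i, Z_i)$ for several values of $\rho$ and checks the empirical correlation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# One standard construction (an assumption, since the lab's exact formula
# is not shown here): Z = rho*X + sqrt(1 - rho^2)*Y is zero-mean,
# unit-variance Gaussian with correlation coefficient rho against X.
def correlated_gaussian(x, y, rho):
    return rho * x + np.sqrt(1 - rho ** 2) * y

x = rng.normal(0, 1, n)
y = rng.normal(0, 1, n)

for rho in (-0.9, 0.0, 0.9):
    z = correlated_gaussian(x, y, rho)
    # A scatter plot of (x, z) tightens toward a line as |rho| -> 1:
    #   plt.plot(x, z, '.')     (matplotlib, if available)
    rho_hat = np.corrcoef(x, z)[0, 1]
```

As $|\rho|$ approaches 1, the scatter plot collapses toward a straight line; at $\rho = 0$ it is a circularly symmetric cloud.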