The two most common expectations are the mean ${\mu}_{X}$ and variance ${\sigma}_{X}^{2}$ defined by

$${\mu}_{X}=E\left[X\right]={\int}_{-\infty}^{\infty}x{f}_{X}\left(x\right)dx$$

$${\sigma}_{X}^{2}=E\left[{\left(X-{\mu}_{X}\right)}^{2}\right]={\int}_{-\infty}^{\infty}{\left(x-{\mu}_{X}\right)}^{2}{f}_{X}\left(x\right)dx$$
A very important type of random variable is the Gaussian, or normal, random variable. A Gaussian random variable has a density function of the following form:

$${f}_{X}\left(x\right)=\frac{1}{\sqrt{2\pi {\sigma}^{2}}}\exp\left(-\frac{{\left(x-\mu \right)}^{2}}{2{\sigma}^{2}}\right)$$
Note that a Gaussian random variable is completely characterized by its mean and variance. This is not necessarily the case for other types of distributions. Sometimes the notation $X\sim N(\mu ,{\sigma}^{2})$ is used to identify $X$ as being Gaussian with mean $\mu $ and variance ${\sigma}^{2}$.
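As a quick numerical illustration of the density above, here is a minimal Python sketch (the lab itself uses Matlab; the function name `gaussian_pdf` is our own, not from the text):

```python
# Evaluate the Gaussian (normal) density f_X(x) for given mean mu and
# variance sigma2. This mirrors the formula in the text.
import math

def gaussian_pdf(x, mu=0.0, sigma2=1.0):
    """Density of N(mu, sigma2) evaluated at the point x."""
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma2)) / math.sqrt(2.0 * math.pi * sigma2)

# For the standard normal N(0, 1), the peak value at x = 0 is 1/sqrt(2*pi).
peak = gaussian_pdf(0.0)
```

The density is maximized at $x=\mu$ and falls off symmetrically on either side.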
Suppose some random experiment may be characterized by a random variable $X$ whose distribution is unknown. For example, suppose we are measuring a deterministic quantity $v$ , but our measurement is subject to a random measurement error $\epsilon $ . We can then characterize the observed value, $X$ , as a random variable, $X=v+\epsilon $ .
If the distribution of $X$ does not change over time, we may gain further insight into $X$ by making several independent observations $\{{X}_{1},{X}_{2},\cdots ,{X}_{N}\}$. These observations ${X}_{i}$, also known as samples, will be independent random variables and have the same distribution ${F}_{X}\left(x\right)$. In this situation, the ${X}_{i}$'s are referred to as i.i.d., for independent and identically distributed. We also sometimes refer to $\{{X}_{1},{X}_{2},\cdots ,{X}_{N}\}$ collectively as a sample, or observation, of size $N$.
Suppose we want to use our observation $\{{X}_{1},{X}_{2},\cdots ,{X}_{N}\}$ to estimate the mean and variance of $X$. Two estimators which should already be familiar to you are the sample mean and sample variance defined by

$${\widehat{\mu}}_{X}=\frac{1}{N}\sum_{i=1}^{N}{X}_{i}$$

$${\widehat{\sigma}}_{X}^{2}=\frac{1}{N-1}\sum_{i=1}^{N}{\left({X}_{i}-{\widehat{\mu}}_{X}\right)}^{2}$$
It is important to realize that these sample estimates are functions of random variables, and are therefore themselves random variables. Therefore we can also talk about the statistical properties of the estimators. For example, we can compute the mean and variance of the sample mean ${\widehat{\mu}}_{X}$:

$$E\left[{\widehat{\mu}}_{X}\right]=\frac{1}{N}\sum_{i=1}^{N}E\left[{X}_{i}\right]={\mu}_{X}$$

$$Var\left[{\widehat{\mu}}_{X}\right]=\frac{1}{{N}^{2}}\sum_{i=1}^{N}Var\left[{X}_{i}\right]=\frac{{\sigma}_{X}^{2}}{N}$$
In both computations we have used the i.i.d. assumption. We can also show that $E\left[{\widehat{\sigma}}_{X}^{2}\right]={\sigma}_{X}^{2}$.
An estimate $\widehat{a}$ of some parameter $a$ which has the property $E\left[\widehat{a}\right]=a$ is said to be an unbiased estimate. An estimator such that $Var\left[\widehat{a}\right]\to 0$ as $N\to \infty $ is said to be consistent. These two properties are highly desirable because they imply that if a large number of samples are used, the estimate will be close to the true parameter.
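These two properties can be checked empirically. The following Python sketch (used here for illustration; the lab itself is in Matlab) estimates the spread of the sample mean of i.i.d. $N(0,1)$ samples for two sample sizes, confirming that the spread shrinks like $1/\sqrt{N}$ as consistency predicts:

```python
# Empirical check that the sample mean of i.i.d. N(0, 1) samples is
# unbiased and consistent: its standard deviation across many repeated
# experiments should be close to 1/sqrt(N) and shrink as N grows.
import numpy as np

rng = np.random.default_rng(0)

def sample_mean_spread(N, trials=2000):
    """Empirical standard deviation of the sample mean over many trials."""
    means = rng.standard_normal((trials, N)).mean(axis=1)
    return means.std()

spread_10 = sample_mean_spread(10)      # theory: about 1/sqrt(10)
spread_1000 = sample_mean_spread(1000)  # theory: about 1/sqrt(1000)
```

Since $Var\left[{\widehat{\mu}}_{X}\right]={\sigma}_{X}^{2}/N$, the second spread should be roughly ten times smaller than the first.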
Suppose $X$ is a Gaussian random variable with mean 0 and variance 1. Use the Matlab function random or randn to generate 1000 samples of $X$, denoted as ${X}_{1}$, ${X}_{2}$, ..., ${X}_{1000}$. See the online help for the random function. Plot them using the Matlab function plot.
We will assume our generated samples are i.i.d.
Write Matlab functions to compute the sample mean and sample variance defined above without using the predefined mean and var functions. Use these functions to compute the sample mean and sample variance of the samples you just generated.
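For comparison, here is an illustrative Python analogue of what those Matlab functions would compute (a sketch only; the exercise itself should be completed in Matlab), avoiding any predefined mean or variance routines:

```python
# Python versions of the sample mean and sample variance estimators,
# written without built-in statistics functions, applied to 1000
# i.i.d. N(0, 1) samples (analogous to Matlab's randn(1, 1000)).
import random

def sample_mean(xs):
    """Sample mean: (1/N) * sum of the samples."""
    return sum(xs) / len(xs)

def sample_variance(xs):
    """Unbiased sample variance, using the 1/(N-1) normalization."""
    mu_hat = sample_mean(xs)
    return sum((x - mu_hat) ** 2 for x in xs) / (len(xs) - 1)

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(1000)]

mu_hat = sample_mean(samples)        # should be close to 0
var_hat = sample_variance(samples)   # should be close to 1
```

With 1000 samples, the estimates should land near the true values 0 and 1, consistent with the unbiasedness and consistency properties discussed above.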