To study the characteristics of a random process, let us look at some of its basic properties and operations. Below we will focus on operations applied to the random signals that compose our random processes. We will denote a random process by $X$ and a random variable from a random process or signal by $x$.
Finding the average value of a set of random signals or random variables is one of the most fundamental concepts we use in evaluating random processes through any sort of statistical method. The mean of a random process is the average of all realizations of that process. In order to find this average, we must look at a random signal over a range of time (possible values) and determine our average from this set of values. The mean, or average, of a random process, $x(t)$, is given by the following equation:

$$\bar{x} = E[x] = \int_{-\infty}^{\infty} x f(x)\, dx$$

where $f(x)$ is the probability density function of the random variable observed.
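As a quick numerical sketch of the integral above, assume a hypothetical uniform density $f(x) = 1/10$ on $[0, 10]$ (this particular density is an illustrative assumption, not from the text); the mean can then be approximated by a Riemann sum:

```python
import numpy as np

# Hypothetical example density: uniform, f(x) = 1/10 on [0, 10].
x = np.linspace(0.0, 10.0, 10001)
f = np.full_like(x, 1.0 / 10.0)
dx = x[1] - x[0]

# Mean as E[x] = integral of x * f(x) dx, approximated by a Riemann sum.
mean = np.sum(x * f) * dx
print(mean)  # ≈ 5.0, the midpoint of the uniform interval
```

For a uniform density the result matches the midpoint of the interval, which is a useful sanity check on the numerical integration.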
If the random variables that make up our random process are discrete or quantized values, such as in a binary process, then the integrals become summations over all the possible values of the random variable. In this case, our expected value becomes

$$\bar{x} = E[x] = \sum_{x} x\, P(x)$$
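A minimal sketch of this discrete case, assuming a hypothetical binary process that takes the value 1 with probability 0.7 and 0 with probability 0.3 (these numbers are illustrative, not from the text):

```python
# Hypothetical binary process: x = 0 with probability 0.3,
# and x = 1 with probability 0.7.
values = [0, 1]
probs = [0.3, 0.7]

# Expected value: E[x] = sum over all values of x * P(x).
mean = sum(v * p for v, p in zip(values, probs))
print(mean)  # 0.7
```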
In the case where we have a random process in which only one sample can be viewed at a time, we will often not have all the information available to calculate the mean using the density function as shown above. In this case we must estimate the mean through the time-average mean, discussed later. For fields such as signal processing that deal mainly with discrete signals and values, these are the averages most commonly used.
If we look at the second moment of the term (we now look at $x^{2}$ in the integral), then we will have the mean-square value of our random process. As you would expect, this is written as

$$\overline{x^{2}} = E[x^{2}] = \int_{-\infty}^{\infty} x^{2} f(x)\, dx$$
Now that we have an idea about the average value or values that a random process takes, we are often interested in seeing just how spread out the different random values might be. To do this, we look at the variance, which is a measure of this spread. The variance, often denoted by $\sigma^{2}$, is written as follows:

$$\sigma^{2} = \operatorname{Var}(x) = E\left[(x - \bar{x})^{2}\right] = E[x^{2}] - \bar{x}^{2}$$
Another common statistical tool is the standard deviation. Once you know how to calculate the variance, the standard deviation is simply the square root of the variance, or $\sigma$.
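To tie the mean, mean-square value, variance, and standard deviation together, here is a small sketch over a hypothetical set of observed values (the sample values are illustrative assumptions, not from the text):

```python
import numpy as np

# Hypothetical set of observed realizations at one instant.
samples = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

mean = samples.mean()              # E[x]
mean_square = (samples**2).mean()  # E[x^2], the mean-square value
variance = mean_square - mean**2   # sigma^2 = E[x^2] - (E[x])^2
std_dev = np.sqrt(variance)        # sigma, the standard deviation

print(mean, variance, std_dev)  # 5.0 4.0 2.0
```

Note that this uses the identity $\sigma^{2} = E[x^{2}] - \bar{x}^{2}$ rather than averaging squared deviations directly; both give the same result.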
In the case where we cannot view the entire ensemble of the random process, we must use time averages to estimate the values of the mean and variance for the process. Generally, this will only give us acceptable results for independent and ergodic processes, meaning those processes in which each signal or member of the process seems to have the same statistical behavior as the entire process. The time averages will also only be taken over a finite interval, since we will only be able to see a finite part of the sample.
For the ergodic random process, $x(t)$, we will estimate the mean using the time-averaging function defined as

$$\bar{x} = \frac{1}{T} \sum_{n=1}^{T} x[n]$$

where $T$ is the number of samples in the finite observation window.
Once the mean of our random process has been estimated, we can simply use that value in the following variance equation (introduced in one of the above sections):

$$\sigma^{2} = \frac{1}{T} \sum_{n=1}^{T} \left(x[n] - \bar{x}\right)^{2}$$
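The time-average estimates described above can be sketched as follows, assuming a hypothetical ergodic process of independent Gaussian samples with true mean 3 and true variance 4 (the distribution, its parameters, and the seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# One realization of a hypothetical ergodic process:
# independent Gaussian samples with true mean 3 and true variance 4.
T = 100_000
x = rng.normal(loc=3.0, scale=2.0, size=T)

# Time-average estimates over the finite observation window.
mean_est = x.sum() / T                   # (1/T) * sum of x[n]
var_est = ((x - mean_est)**2).sum() / T  # (1/T) * sum of (x[n] - mean)^2

print(mean_est, var_est)  # close to 3 and 4
```

Because the process is ergodic, averaging one long realization over time approaches the ensemble mean and variance; with a finite window the estimates are close to, but not exactly, the true values.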
Let us now look at how some of the formulas and concepts above apply to a simple example. We will consider just a single, continuous random variable for this example, but the calculations and methods are the same for a random process. For this example, we will consider a random variable having the probability density function shown in the accompanying figure.
First, we will use the definition of the mean to solve for the mean value.