In this and the following modules the basic concepts of information theory will be introduced. For simplicity we assume that the signals are time discrete. Time-discrete signals often arise from sampling a time-continuous signal. The assumption of time-discrete signals is valid because we will only be looking at bandlimited signals, which, as we know, can be perfectly reconstructed from their samples.
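The reconstruction claim can be illustrated numerically. The sketch below, with frequencies and sample counts chosen purely for illustration, samples a bandlimited signal above the Nyquist rate and rebuilds it between sample points with (truncated) sinc interpolation:

```python
import numpy as np

# Illustrative parameters (our own choices, not from the text)
f_max = 100.0          # highest frequency component in the signal (Hz)
fs = 2.5 * f_max       # sampling rate above the Nyquist rate 2 * f_max
T = 1.0 / fs           # sampling interval

def x(t):
    """Bandlimited test signal: two sinusoids below f_max."""
    return np.sin(2 * np.pi * 40.0 * t) + 0.5 * np.cos(2 * np.pi * 90.0 * t)

n = np.arange(-200, 201)   # sample indices (the ideal sum is infinite)
samples = x(n * T)         # the time-discrete signal x[n] = x(nT)

def reconstruct(t):
    """Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - nT) / T)."""
    return np.sum(samples * np.sinc((t - n * T) / T))

t0 = 0.0123                # an arbitrary time between sample points
print(abs(reconstruct(t0) - x(t0)))   # small residual error
```

The residual is nonzero only because the interpolation sum is truncated; with infinitely many samples the reconstruction is exact.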
In treating time-discrete signals and their information content we have to distinguish between two types of signals:
The signals treated are mainly of a stochastic nature, i.e. the signal is unknown to us. Since the signal is not known to the destination (because of its stochastic nature), it is best modeled as a random process, discrete-time or continuous-time. Examples of information sources that we model as random processes are:
Video and speech are analog information signals that are bandlimited. Therefore, if sampled faster than two times the highest frequency component, they can be reconstructed from their sample values.
A speech signal with a bandwidth of 3100 Hz can be sampled at the rate of 6.2 kHz. If the samples are quantized with an 8-level quantizer then the speech signal can be represented with a binary sequence with bit rate $6200\times \log_{2}8=18600$ bits per second.
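The arithmetic behind that bit rate can be sketched directly from the values in the text:

```python
import math

bandwidth_hz = 3100                  # speech bandwidth from the text
fs = 2 * bandwidth_hz                # Nyquist sampling rate: 6200 Hz
levels = 8                           # 8-level quantizer
bits_per_sample = math.log2(levels)  # log2(8) = 3 bits per sample

bit_rate = fs * bits_per_sample
print(bit_rate)   # 18600.0 bits per second
```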
The sampled real values can be quantized to create a discrete-time discrete-valued random process.
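As a minimal sketch of this quantization step, the snippet below maps real-valued samples onto a finite set of levels, producing a discrete-time, discrete-valued sequence. The uniform 8-level quantizer over $[-1,1]$ is our own illustrative choice, not a construction from the text:

```python
import numpy as np

levels = 8
step = 2.0 / levels                  # quantization step over [-1, 1]

def quantize(x):
    """Map values in [-1, 1] to the midpoints of `levels` uniform cells."""
    idx = np.clip(np.floor((x + 1.0) / step), 0, levels - 1).astype(int)
    return -1.0 + (idx + 0.5) * step

samples = np.array([-0.93, -0.2, 0.01, 0.42, 0.99])   # sampled real values
print(quantize(samples))   # each output is one of only 8 possible values
```

Because the output alphabet is finite, each quantized sample can be encoded with $\log_{2}8=3$ bits, which is exactly what the bit-rate calculation above assumed.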
The key observation from the discussion above is that for a receiver the signals are unknown. It is exactly this uncertainty that enables the signal to transmit information. This is the core of information theory:
Here we review a few basic concepts from statistics and introduce the notation.
Let $X$ be a stochastic variable. Let $X={x}_{i}$ and $X={x}_{j}$ denote two outcomes of $X$.
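As a concrete illustration of this notation, consider a stochastic variable with four outcomes; the outcome names and probabilities below are our own example, not values from the text:

```python
# Hypothetical discrete stochastic variable X with outcomes x_1..x_4
# and their probabilities P(X = x_i)
p = {"x1": 0.5, "x2": 0.25, "x3": 0.125, "x4": 0.125}

# The probabilities of all outcomes must sum to one
total = sum(p.values())
print(total)               # 1.0

# For two distinct outcomes x_i and x_j, P(X = x_i or X = x_j)
# is the sum of the individual probabilities
print(p["x1"] + p["x2"])   # 0.75
```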