Introduction to Information and Entropy

In this and the following modules, the basic concepts of information theory are introduced. For simplicity we assume that the signals are time discrete. Time discrete signals often arise from sampling a time continuous signal. The assumption of time discrete signals is valid because we will only be looking at bandlimited signals, which, as we know, can be perfectly reconstructed from their samples.

In treating time discrete signals and their information content we have to distinguish between two types of signals:

  • signals that have amplitude levels belonging to a finite set
  • signals that have amplitudes taken from the real line
In the first case we can measure the information content in terms of entropy, while in the second case the entropy is infinite and we must resort to characterising the source by means of differential entropy.
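As a concrete illustration of the finite-alphabet case, the sketch below computes the entropy $H(X) = -\sum_i p_i \log_2 p_i$ of a discrete source in Python. The four-symbol distribution is made up purely for demonstration; the entropy formula itself is developed in the following modules.

```python
import math

def entropy(probs):
    """Entropy in bits of a discrete source with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source; the probabilities are illustrative only.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))  # 1.75 bits per symbol
```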

Examples of information sources

The signals treated here are mainly of a stochastic nature, i.e. the signal is unknown to us. Since the signal is not known to the destination (because of its stochastic nature), it is best modeled as a random process, discrete-time or continuous-time. Examples of information sources that we model as random processes are:

  • Digital data sources (e.g. a text) can be modeled as a random process.
  • Video signals can be modeled as a random process. Such signals are mainly bandlimited to around 5 MHz (the value depends on the standard used to raster the frames of the image).
  • Audio signals can be modeled as a random process. Speech is typically between 300 Hz and 3400 Hz; see the figure below.
Figure: Power spectral density plot of speech

Video and speech are analog information signals and are bandlimited. Therefore, if sampled faster than two times the highest frequency component, they can be perfectly reconstructed from their sample values.

A speech signal with a bandwidth of 3100 Hz can be sampled at a rate of 6.2 kHz. If the samples are quantized with an 8-level quantizer, then the speech signal can be represented by a binary sequence with bit rate

$6200 \cdot \log_2 8 = 18600$ bits/sec

Figure: Analog speech signal sampled and quantised

The sampled real values can be quantized to create a discrete-time discrete-valued random process.
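The sketch below ties the chain together in Python: it samples a hypothetical band-limited test tone at 6.2 kHz (a stand-in for an actual speech signal, chosen only for illustration), quantizes it with a uniform 8-level quantizer, and confirms the 18600 bits/sec rate computed above.

```python
import numpy as np

fs = 6200          # sampling rate in Hz (twice the 3100 Hz bandwidth)
levels = 8         # quantizer levels -> log2(8) = 3 bits per sample
duration = 1.0     # seconds

# Hypothetical stand-in for a speech signal: a band-limited 1 kHz test tone.
t = np.arange(0, duration, 1 / fs)
x = np.sin(2 * np.pi * 1000 * t)   # amplitude in [-1, 1]

# Uniform quantizer: map each sample to the nearest of 8 levels spanning [-1, 1].
step = 2.0 / levels
indices = np.clip(np.floor((x + 1.0) / step), 0, levels - 1).astype(int)
x_quantized = -1.0 + (indices + 0.5) * step

bit_rate = fs * np.log2(levels)
print(bit_rate)    # 18600.0 bits/sec, matching the computation above
```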


The core of information theory

The key observation from the discussion above is that, for a receiver, the signals are unknown. It is exactly this uncertainty that enables the signal to transmit information. This is the core of information theory:

Information transfer happens when the receiver is unable to know or predict a message before it is received.

Some statistics

Here we present some statistics with the intent of reviewing a few basic concepts and introducing the notation.

Let $X$ be a stochastic variable. Let $X = x_i$ and $X = x_j$ denote two outcomes of $X$.

  • Dependent outcomes imply: $P(X = x_i, X = x_j) = P(X = x_i) P(X = x_j \mid X = x_i) = P(X = x_j) P(X = x_i \mid X = x_j)$
  • Independent outcomes imply: $P(X = x_i, X = x_j) = P(X = x_i) P(X = x_j)$
  • Bayes' rule: $P(X = x_i \mid X = x_j) = \frac{P(X = x_j \mid X = x_i) P(X = x_i)}{P(X = x_j)}$
More about basic probability theory and a derivation of Bayes' rule can be found here.
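As a quick numerical sanity check of the identities above, the following Python sketch builds a made-up joint distribution over two binary outcomes, derives the marginals and conditionals from it, and verifies Bayes' rule on every outcome pair.

```python
# Made-up joint distribution over two binary events A and B (illustrative only).
p_joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

# Marginals: P(A=a) and P(B=b) obtained by summing out the other variable.
p_a = {a: sum(p for (ai, _), p in p_joint.items() if ai == a) for a in (0, 1)}
p_b = {b: sum(p for (_, bi), p in p_joint.items() if bi == b) for b in (0, 1)}

# Conditionals from the joint: P(B=b | A=a) = P(A=a, B=b) / P(A=a), and vice versa.
p_b_given_a = {(a, b): p_joint[(a, b)] / p_a[a] for (a, b) in p_joint}
p_a_given_b = {(a, b): p_joint[(a, b)] / p_b[b] for (a, b) in p_joint}

# Bayes' rule: P(A=a | B=b) = P(B=b | A=a) * P(A=a) / P(B=b)
for (a, b) in p_joint:
    lhs = p_a_given_b[(a, b)]
    rhs = p_b_given_a[(a, b)] * p_a[a] / p_b[b]
    assert abs(lhs - rhs) < 1e-12
print("Bayes' rule verified on all outcome pairs")
```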





Source:  OpenStax, Information and signal theory. OpenStax CNX. Aug 03, 2006 Download for free at http://legacy.cnx.org/content/col10211/1.19
