  1. $N = 4$, $p(x_1) = 0.5$, $p(x_2) = 0.25$, $p(x_3) = p(x_4) = 0.125$. The total information is
     \[ \sum_{i=1}^{4} I(x_i) = \sum_{i=1}^{4} \log\frac{1}{p(x_i)} = 1 + 2 + 3 + 3 = 9 \text{ bits}. \]
     The entropy of the source is
     \[ H(x) = \tfrac{1}{2}\cdot 1 + \tfrac{1}{4}\cdot 2 + \tfrac{1}{8}\cdot 3 + \tfrac{1}{8}\cdot 3 = 1.75 \text{ bits/symbol}. \]
  2. $N = 4$, $p(x_i) = 0.25$ for all $i$. The total information is $\sum_{i=1}^{4} I(x_i) = 2 + 2 + 2 + 2 = 8$ bits. The entropy of the source is
     \[ H(x) = \tfrac{1}{4}\cdot 2 + \tfrac{1}{4}\cdot 2 + \tfrac{1}{4}\cdot 2 + \tfrac{1}{4}\cdot 2 = 2 \text{ bits/symbol}. \]
     (Both calculations are checked numerically in the sketch below.)
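A short Python sketch (an illustration, not part of the original text) that computes the total information and the entropy of the two sources above:

    import math

    def total_information(probs):
        # sum of I(x_i) = log2(1/p(x_i)) over all symbols, in bits
        return sum(math.log2(1.0 / p) for p in probs)

    def entropy(probs):
        # H(x) = sum of p(x_i) * log2(1/p(x_i)), in bits per symbol
        return sum(p * math.log2(1.0 / p) for p in probs)

    source1 = [0.5, 0.25, 0.125, 0.125]
    source2 = [0.25, 0.25, 0.25, 0.25]

    print(total_information(source1), entropy(source1))   # 9.0  1.75
    print(total_information(source2), entropy(source2))   # 8.0  2.0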

In Example [link], messages of the same length from the first source convey less information than those from the second source. Hence, sources with the same number of symbols but different probabilities can have different entropies. The key is to design a system that maximizes entropy, since this gives the largest throughput, that is, the largest average flow of information. But how can this be achieved?

First, consider the simple case in which there are two symbols in the alphabet: $x_1$ with probability $p$, and $x_2$ with probability $1-p$. (Think of a coin that is weighted so as to give heads with higher probability than tails.) Applying the definition [link] shows that the entropy is

\[ H(p) = -p\log(p) - (1-p)\log(1-p). \]

This is plotted as a function of $p$ in [link]. For all allowable values of $p$, $H(p)$ is positive. As $p$ approaches either zero or one, $H(p)$ approaches zero; these represent the symmetric cases in which either $x_1$ occurs all the time or $x_2$ occurs all the time, and no information is conveyed. $H(p)$ reaches its maximum in the middle, at $p = 0.5$. For this example, entropy is maximized when both symbols are equally likely.

Figure [link]: Entropy of a binary signal with probabilities $p$ and $1-p$.
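As a numerical illustration (not from the original text), the binary entropy function can be tabulated directly; it rises from zero at the endpoints to a peak of one bit at $p = 0.5$:

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p); H(0) = H(1) = 0 by convention
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
        print(f"p = {p:4.2f}   H(p) = {binary_entropy(p):.3f} bits")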

Show that $H(p)$ is maximized at $p = 0.5$ by taking the derivative and setting it equal to zero.
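For reference, a sketch of how this calculation might proceed (using the base-change identity $\log(z) = \log(e)\ln(z)$): the derivative of $H(p)$ is
\[
\frac{dH}{dp} = -\log(p) - \log(e) + \log(1-p) + \log(e) = \log\!\left(\frac{1-p}{p}\right),
\]
which equals zero when $(1-p)/p = 1$, that is, at $p = 0.5$. Since the derivative is positive for $p < 0.5$ and negative for $p > 0.5$, this critical point is a maximum.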

The next result shows that an $N$-symbol source cannot have entropy larger than $\log(N)$, and that this bound is achieved when all the symbols are equally likely. Mathematically, $H(x) \leq \log(N)$, which is demonstrated by showing that $H(x) - \log(N) \leq 0$. First,

\[
H(x) - \log(N) = \sum_{i=1}^{N} p(x_i)\log\frac{1}{p(x_i)} - \log(N) = \sum_{i=1}^{N} p(x_i)\log\frac{1}{p(x_i)} - \sum_{i=1}^{N} p(x_i)\log(N),
\]

since $\sum_{i=1}^{N} p(x_i) = 1$. Gathering terms, this can be rewritten

\[
H(x) - \log(N) = \sum_{i=1}^{N} p(x_i)\left[\log\frac{1}{p(x_i)} - \log(N)\right] = \sum_{i=1}^{N} p(x_i)\log\frac{1}{N\,p(x_i)},
\]

and changing the base of the logarithm (using $\log(z) \equiv \log_2(z) = \log_2(e)\ln(z)$, where $\ln(z) \equiv \log_e(z)$) gives

\[
H(x) - \log(N) = \log(e)\sum_{i=1}^{N} p(x_i)\ln\frac{1}{N\,p(x_i)}.
\]

If all symbols are equally likely, $p(x_i) = 1/N$, then $\frac{1}{N\,p(x_i)} = 1$ and $\ln\frac{1}{N\,p(x_i)} = \ln(1) = 0$. Hence $H(x) = \log(N)$. On the other hand, if the symbols are not equally likely, then the inequality $\ln(z) \leq z - 1$ (which holds for $z \geq 0$) implies that

\[
H(x) - \log(N) \leq \log(e)\sum_{i=1}^{N} p(x_i)\left[\frac{1}{N\,p(x_i)} - 1\right] = \log(e)\left[\sum_{i=1}^{N}\frac{1}{N} - \sum_{i=1}^{N} p(x_i)\right] = \log(e)\,[1 - 1] = 0.
\]

Rearranging [link] gives the desired bound on the entropy, $H(x) \leq \log(N)$. This says that, all else being equal, it is preferable to choose a code in which each symbol occurs with the same probability. Indeed, Example [link] provides a concrete source for which the equal-probability case has higher entropy than the unequal case.
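The bound can also be probed numerically. A small Python sketch (illustrative only, not from the original text) draws random probability vectors and confirms that their entropy never exceeds $\log_2(N)$, while the uniform distribution attains it:

    import math
    import random

    def entropy(probs):
        # H(x) in bits; terms with p = 0 contribute nothing
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    N = 8
    print("bound  log2(N):", math.log2(N))            # 3.0
    print("uniform source:", entropy([1.0 / N] * N))  # 3.0, attains the bound

    for _ in range(5):
        raw = [random.random() for _ in range(N)]
        probs = [r / sum(raw) for r in raw]           # normalize to a distribution
        assert entropy(probs) <= math.log2(N) + 1e-12
        print("random source :", round(entropy(probs), 4))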

"Redundancy" showed how letters in the text of natural languages do not occur with equal probability. Thus,naively using the letters will not lead to an efficient transmission. Rather, the letters must be carefully translatedinto equally probable symbols in order to increase the entropy. A method for accomplishing this translation is given in "Source Coding" , but "Channel Capacity" examines the limits of attainable performance when transmitting symbols across a noisy (but otherwise perfect) channel.

Source: OpenStax, Software Receiver Design. OpenStax CNX, Aug 13, 2013. Download for free at http://cnx.org/content/col11510/1.3