A description of typical sequences.

If the binary symmetric channel has crossover probability $\epsilon$ and $x$ is transmitted, then by the Law of Large Numbers the output $y$ differs from $x$ in roughly $n\epsilon$ places when $n$ is very large:

$$d_H(x, y) \approx n\epsilon$$

The number of sequences of length $n$ that differ from $x$ in $n\epsilon$ places is

$$\binom{n}{n\epsilon} = \frac{n!}{(n\epsilon)! \, (n - n\epsilon)!}$$

Example: let $x = (0, 0, 0)$, $\epsilon = \frac{1}{3}$, and $n = 3$, so that $n\epsilon = 3 \cdot \frac{1}{3} = 1$. The number of output sequences that differ from $x$ in one element is $\binom{3}{1} = \frac{3!}{1! \, 2!} = 3$, given by $(1, 0, 0)$, $(0, 1, 0)$, and $(0, 0, 1)$.
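The enumeration above can be sketched in a few lines of Python (standard library only; the helper name `hamming_neighbors` is my own choice, not from the module):

```python
from itertools import combinations
from math import comb

def hamming_neighbors(x, d):
    """All binary sequences differing from x in exactly d positions."""
    neighbors = []
    for positions in combinations(range(len(x)), d):
        y = list(x)
        for i in positions:
            y[i] ^= 1  # flip the bit at this position
        neighbors.append(tuple(y))
    return neighbors

x = (0, 0, 0)
print(hamming_neighbors(x, 1))  # [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(comb(3, 1))               # 3, matching the binomial count
```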


Using Stirling's approximation

$$n! \approx n^n e^{-n} \sqrt{2\pi n}$$

we can approximate

$$\binom{n}{n\epsilon} \approx 2^{n H_b(\epsilon)}$$

where $H_b(\epsilon) = -\epsilon \log_2 \epsilon - (1 - \epsilon) \log_2 (1 - \epsilon)$ is the entropy of a binary memoryless source. For any $x$ there are roughly $2^{n H_b(\epsilon)}$ highly probable outputs that correspond to this input.
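To see how tight the approximation is, here is a small numerical check (a Python sketch using only the standard library; `Hb` is my name for the binary entropy function):

```python
from math import comb, log2

def Hb(eps):
    """Binary entropy function H_b(eps) in bits."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)

n, eps = 300, 1 / 3
exact = log2(comb(n, round(n * eps)))  # exact log2 of the binomial coefficient
approx = n * Hb(eps)                   # exponent predicted by Stirling
print(exact, approx)                   # the two agree to within a few percent
```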

Consider the output vector $Y$ as a very long random vector with entropy $n H(Y)$. As discussed earlier, the number of typical (or highly probable) sequences is roughly $2^{n H(Y)}$. Therefore, $2^n$ is the total number of binary sequences, $2^{n H(Y)}$ is the number of typical sequences, and $2^{n H_b(\epsilon)}$ is the number of elements in the group of possible outputs for one input vector. The maximum number of input sequences that produce nonoverlapping output sequences is

$$M = \frac{2^{n H(Y)}}{2^{n H_b(\epsilon)}} = 2^{n (H(Y) - H_b(\epsilon))}$$

The number of distinguishable input sequences of length $n$ is

$$2^{n (H(Y) - H_b(\epsilon))}$$

so the number of information bits that can be sent across the channel reliably per $n$ channel uses is $n (H(Y) - H_b(\epsilon))$. The maximum reliable transmission rate per channel use is

$$R = \frac{\log_2 M}{n} = \frac{n (H(Y) - H_b(\epsilon))}{n} = H(Y) - H_b(\epsilon)$$
The maximum rate can be increased by increasing $H(Y)$. Note that $H_b(\epsilon)$ is a function only of the crossover probability and cannot be minimized further.

The entropy of the channel output is the entropy of a binary random variable. Suppose the input is chosen to be uniformly distributed, with $p_X(0) = p_X(1) = \frac{1}{2}$.

Then

$$p_Y(0) = (1 - \epsilon)\, p_X(0) + \epsilon\, p_X(1) = \frac{1}{2}$$

and

$$p_Y(1) = (1 - \epsilon)\, p_X(1) + \epsilon\, p_X(0) = \frac{1}{2}$$

Then $H(Y)$ takes its maximum value of 1, resulting in a maximum rate $R = 1 - H_b(\epsilon)$ when $p_X(0) = p_X(1) = \frac{1}{2}$. This result says that ordinarily one bit is transmitted across a BSC with reliability $1 - \epsilon$. If one needs the probability of error to approach zero, then one should reduce the transmission of information to $1 - H_b(\epsilon)$ bits per channel use and add redundancy.
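The rate $1 - H_b(\epsilon)$ is easy to evaluate numerically; a minimal Python sketch (the function names are mine, not from the module):

```python
from math import log2

def Hb(eps):
    """Binary entropy function H_b(eps) in bits."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)

def bsc_capacity(eps):
    """Maximum reliable rate of a BSC with crossover probability eps."""
    return 1 - Hb(eps)

for eps in (0.0, 0.11, 0.5):
    print(eps, bsc_capacity(eps))
```

As expected, the rate is 1 bit per channel use for a noiseless channel ($\epsilon = 0$) and drops to 0 when $\epsilon = \frac{1}{2}$, where the output is independent of the input.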

Recall that for binary symmetric channels (BSC)

$$H(Y \mid X) = p_X(0)\, H(Y \mid X = 0) + p_X(1)\, H(Y \mid X = 1) = p_X(0) \left( -\epsilon \log_2 \epsilon - (1 - \epsilon) \log_2 (1 - \epsilon) \right) + p_X(1) \left( -\epsilon \log_2 \epsilon - (1 - \epsilon) \log_2 (1 - \epsilon) \right) = H_b(\epsilon)$$

Therefore, the maximum rate was indeed

$$R = H(Y) - H(Y \mid X) = I(X; Y)$$
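The identity $R = H(Y) - H(Y \mid X) = I(X;Y)$ can be checked numerically for any input distribution, not just the uniform one; a short Python sketch (function names are my own):

```python
from math import log2

def H(p):
    """Entropy in bits of a probability vector."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def bsc_mutual_information(p0, eps):
    """I(X;Y) for a BSC with P(X=0) = p0 and crossover probability eps."""
    py0 = (1 - eps) * p0 + eps * (1 - p0)  # P(Y=0)
    HY = H([py0, 1 - py0])                 # output entropy H(Y)
    HYgX = H([eps, 1 - eps])               # H(Y|X) = H_b(eps) for either input
    return HY - HYgX

print(bsc_mutual_information(0.5, 0.1))  # uniform input: equals 1 - H_b(0.1)
print(bsc_mutual_information(0.3, 0.1))  # non-uniform input: strictly smaller
```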

The maximum reliable rate for a BSC is $1 - H_b(\epsilon)$. The rate is 1 when $\epsilon = 0$ or $\epsilon = 1$. The rate is 0 when $\epsilon = \frac{1}{2}$.


This module provides background information necessary for an understanding of Shannon's Noisy Channel Coding Theorem . It is also closely related to material presented in Mutual Information .

Source:  OpenStax, Digital communication systems. OpenStax CNX. Jan 22, 2004 Download for free at http://cnx.org/content/col10134/1.3