Before Shannon it was commonly believed that the only way of achieving arbitrarily small probability of error on a communications channel was to reduce the transmission rate to zero. Today we are wiser. Information theory characterizes a channel by a single parameter: the channel capacity. Shannon demonstrated that it is possible to transmit information at any rate below capacity with an arbitrarily small probability of error.
—from A. R. Calderbank, “The Art of Signaling: Fifty Years of Coding Theory,” IEEE Transactions on Information Theory, p. 2561, October 1998.
The underlying purpose of any communication system is to transmit information. But what exactly is information? How is it measured? Are there limits to the amount of data that can be sent over a channel, even when all the parts of the system are operating at their best? This chapter addresses these fundamental questions using the ideas of Claude Shannon (1916–2001), who defined a measure of information in terms of bits. The number of bits per second that can be transmitted over the channel (taking into account its bandwidth, the power of the signal, and the noise) is called the bit rate, and can be used to define the capacity of the channel.
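The dependence of capacity on bandwidth, signal power, and noise is captured by the well-known Shannon–Hartley formula, C = B log2(1 + S/N) bits per second. A minimal sketch of this calculation follows; the 3000 Hz bandwidth and 30 dB signal-to-noise ratio are illustrative values, not figures from the text:

```python
from math import log2

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley capacity (bits/second) of a bandlimited
    channel with additive white Gaussian noise."""
    return bandwidth_hz * log2(1 + signal_power / noise_power)

# A telephone-line-like channel: 3000 Hz bandwidth, 30 dB SNR (ratio 1000)
c = channel_capacity(3000, 1000, 1)
print(round(c))  # roughly 29902 bits per second
```

Note that capacity grows only logarithmically with signal power: doubling the bandwidth helps far more than doubling the transmitted power.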
Unfortunately, Shannon's results do not give a recipe for how to construct a system that achieves the optimal bit rate. Earlier chapters have highlighted several problems that can arise in communication systems (including synchronization errors such as intersymbol interference). This chapter assumes that all of these are perfectly mitigated. Thus, in [link] , the inner parts of the communication system are assumed to be ideal, except for the presence of channel noise. Even so, most systems still fall far short of the optimal performance promised by Shannon.
There are two problems. First, most messages that people want to send are redundant, and the redundancy squanders the capacity of the channel. A solution is to preprocess the message so as to remove the redundancies. This is called source coding, and is discussed in "Source Coding". For instance, as demonstrated in "Redundancy", any natural language (such as English), whether spoken or written, is repetitive. Information theory (as Shannon's approach is called) quantifies the repetitiveness, and gives a way to judge the efficiency of a source code by comparing the information content of the message to the number of bits required by the code.
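The information content of a message can be quantified by Shannon's entropy, H = −Σ pᵢ log2 pᵢ bits per symbol, where the pᵢ are the symbol probabilities. The sketch below estimates entropy from empirical symbol frequencies; the function name and sample sentence are illustrative choices, not from the text:

```python
from math import log2
from collections import Counter

def entropy_bits_per_symbol(message):
    """Empirical entropy of the symbol frequencies in `message`:
    H = -sum(p * log2(p)) over the observed symbols."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

text = "the quick brown fox jumps over the lazy dog"
h = entropy_bits_per_symbol(text)
# h is below the maximum of log2(27) = 4.75 bits for 26 letters
# plus space, so a good source code can use fewer bits per
# character than a fixed-length code would.
```

A source code approaches optimality when its average output length per symbol approaches this entropy.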
The second problem is that messages must be resistant to noise. If a message arrives at the receiver in garbled form, then the system has failed. A solution is to preprocess the message by adding extra bits, which can be used to determine if an error has occurred, and to correct errors when they do occur. For example, one simple system would transmit each bit three times. Whenever a single bit error occurs in transmission, then the decoder at the receiver can figure out by a simple voting rule that the error has occurred and what the bit should have been. Schemes for finding and removing errors are called error-correcting codes or channel codes, and are discussed in "Channel Coding".
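The triple-repetition scheme with majority voting described above can be sketched as follows; the function names and example message are illustrative, not part of the text:

```python
def encode_repeat3(bits):
    """Channel-encode by repeating each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_repeat3(received):
    """Decode by majority vote over each group of three bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

msg = [1, 0, 1, 1]
coded = encode_repeat3(msg)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
coded[4] = 1                  # a single bit error in the second group
print(decode_repeat3(coded))  # [1, 0, 1, 1] -- the error is corrected
```

The cost is clear: the code transmits three channel bits per message bit, so the rate drops to one third, and two errors within the same group of three still fool the voter. More sophisticated channel codes achieve better protection at far less cost in rate.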