Introduces the concept of forward error correcting codes, in particular those based on block coding techniques. Includes parity check equations and the parity check matrix.

Block FECC coding

Forward error correcting coding (FECC)

Block codes are one example of the forward error correcting coding (FECC) technique, in which we encode the signal by adding redundant bits or digits so that the decoder is able to correct most of the errors introduced by transmission through a noisy channel. FECC was invented for deep space probes, where the extremely long transmission path causes a high propagation loss and the received data therefore has a particularly low signal-to-noise ratio, since the modest transmitter power is limited by the solar panel output.

Because the information data bits are included directly within the codewords, with the additional bits simply appended, this is a systematic encoder. The additional bits required to transmit the redundant information increase the data rate, which consumes more bandwidth if we wish to maintain the same throughput, but if we seek low error rates this trade-off is usually acceptable.

Error probability against received noise level for FECC and uncoded data transmissions

[link] shows the typical error rate performance for uncoded data compared with FECC data. It plots the bit error probability, P_b, against E_b/N_0, the ratio of energy per bit to noise power spectral density, which is the signal-to-noise ratio measure normally used on these error rate plots.
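As a side note (not part of the original figure), E_b/N_0 can be related to the more familiar signal-to-noise power ratio S/N, measured in a receiver bandwidth B at a bit rate R_b, by:

E_b/N_0 = (S/R_b) / (N/B) = (S/N) × (B/R_b)

Plotting against E_b/N_0 therefore allows a fair comparison between coded and uncoded transmissions that operate at different bit rates and bandwidths.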

FECC is used widely in compact discs (CD), computer storage and data transmission, all manner of radio links, data modems, video, TV and cellphone transmissions, space communications, etc. Note in [link] the ability of FECC to achieve a much lower error rate than uncoded data transmission at low bit error probability, P_b.

ASCII coding

In some computer communication systems, information is sent as 7-bit ASCII codes with a parity check bit added on the end. Using even parity, the 7-bit all-zero ASCII code 0000000 expands into 00000000, while 0000001 codes to 00000011. These two codewords (and every other pair of valid codewords) thus differ by at least two binary digits, a Hamming distance of 2. [link] shows that we can use this to detect 1 (or any odd number of) errors.
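The following short sketch (in Python, not part of the original notes) illustrates the even parity calculation and the resulting Hamming distance for the two example codewords above:

def add_even_parity(bits):
    # Append one bit so that the 8-bit codeword contains an even number of ones.
    return bits + [sum(bits) % 2]

def hamming_distance(a, b):
    # Count the bit positions in which two codewords differ.
    return sum(x != y for x, y in zip(a, b))

c0 = add_even_parity([0, 0, 0, 0, 0, 0, 0])    # -> 00000000
c1 = add_even_parity([0, 0, 0, 0, 0, 0, 1])    # -> 00000011
print(hamming_distance(c0, c1))                # 2, the minimum Hamming distance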

ASCII code example where received codeword has single error in the 5th bit position

The block length is then n = 8 and the number of information bits k = 7. This parity check assists with error detection, but it is insufficiently redundant to achieve any error correction capability, as only one check bit is added and the coding rate is a high 7/8.

The minimum distance in binary digits between any two codewords is known as the minimum Hamming distance, D_min, which is 2 for the case of odd or even parity check in ASCII data transmission. We can then calculate the error detecting and correcting power of a code from this minimum distance between error-free codewords, see the error correction capability module.
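The standard results (covered in the error correction capability module) are that a code with minimum Hamming distance D_min can detect up to D_min − 1 errors and correct up to ⌊(D_min − 1)/2⌋ errors. With D_min = 2, the parity checked ASCII code can therefore detect a single error but correct none.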

Although we shall look exclusively at coding schemes for binary systems, error correcting and detecting coding is not confined to binary digits. For example, the ISBN numbers used on books have a check digit appended to them which is calculated using modulo 11 arithmetic.
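As an illustration (a minimal sketch in Python, not from the original notes), the classic 10-digit ISBN check requires the weighted sum of the digits, with weights 10 down to 1, to be divisible by 11; the final check digit may take the value 10, printed as 'X':

def isbn10_is_valid(isbn):
    # Strip hyphens and spaces; 'X' in the check position represents the value 10.
    digits = [10 if ch in 'Xx' else int(ch) for ch in isbn if ch not in '- ']
    if len(digits) != 10:
        return False
    # The weighted sum with weights 10, 9, ..., 1 must be divisible by 11.
    return sum(w * d for w, d in zip(range(10, 0, -1), digits)) % 11 == 0

print(isbn10_is_valid('0-306-40615-2'))   # True: the checksum is satisfied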

Block code construction

Block codes collect or arrange incoming information carrying data into groups of k binary digits and add coding (i.e. parity) bits to increase the coded block length up to n bits, where n>k.

Block coder with k information digits and appended parity check bits

The coding rate R is simply the ratio of data or information carrying bits to the overall block length, R = k/n. The number of parity check (redundant) bits is therefore n − k, [link]. Such a block code is usually described as an (n, k) code.
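For the parity checked ASCII example above, n = 8 and k = 7, so R = 7/8 and there is n − k = 1 redundant bit; the (7, 4) code constructed below has R = 4/7 and n − k = 3 parity check bits.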

Block code example

Suppose we want to code k = 4 information bits into an n = 7 bit codeword, giving a coding rate of 4/7. Code design is performed using finite field algebra to obtain linear codes. We can realise this (7, 4) block code using 3-input exclusive-OR (EX-OR) gates to form the three even parity check bits, P_1, P_2 and P_3, from the 4 information carrying bits, I_1 ... I_4, as shown in [link].

Logic gate representation for (n, k) block coder where k = 4 information bits and n = 7 encoded block length (i.e. (7, 4) coder)

This circuitry can be represented by the logic gates in [link], or written either as a set of parity check equations or as the corresponding parity check matrix H, as in [link].

Parity check bit computation and corresponding H matrix representation for (7, 4) block encoder

Remember here that the “cross-in-the-circle” symbol indicates a bitwise exclusive-OR (EX-OR) operation. This H matrix can also be used to directly generate codewords from the information bits via a closely related G matrix.
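As a minimal sketch (in Python, and not taken from the figure), the following shows how three even parity bits might be formed from the four information bits and checked against a parity check matrix H. The particular parity equations and bit ordering below are one commonly used (7, 4) Hamming assignment; the exact equations in the figure above may differ, but every valid codeword c satisfies H·c = 0 (modulo 2) in the same way.

import numpy as np

# Assumed parity check matrix H = [A | I3] for codeword order (I1, I2, I3, I4, P1, P2, P3).
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])

def encode(i1, i2, i3, i4):
    # Each parity bit is the EX-OR (even parity) of three of the information bits.
    p1 = i1 ^ i2 ^ i3
    p2 = i2 ^ i3 ^ i4
    p3 = i1 ^ i3 ^ i4
    return np.array([i1, i2, i3, i4, p1, p2, p3])

codeword = encode(1, 0, 1, 1)
syndrome = H.dot(codeword) % 2     # all-zero syndrome for an error-free codeword
print(codeword, syndrome)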

This is an example of a systematic code, where the data is included directly within the codeword. Convolutional FECC, covered in a later module, is an example of a non-systematic coder, in which the information carrying data is not explicitly included within the transmissions, although the transmitted coded data is derived from the information data.

This module has been created from lecture notes originated by P M Grant and D G M Cruickshank which are published in I A Glover and P M Grant, "Digital Communications", Pearson Education, 2009, ISBN 978-0-273-71830-7. Powerpoint slides plus end of chapter problem examples/solutions are available for instructor use via password access at http://www.see.ed.ac.uk/~pmg/DIGICOMMS/

Source:  OpenStax, Communications source and channel coding with examples. OpenStax CNX. May 07, 2009 Download for free at http://cnx.org/content/col10601/1.3