The Huffman source coding algorithm is provably maximally efficient.

Shannon's Source Coding Theorem has additional applications in data compression. Here, we have a symbolic-valued signal source, like a computer file or an image, that we want to represent with as few bits as possible. Compression schemes that assign symbols to bit sequences are known as lossless if they obey the Source Coding Theorem; they are lossy if they use fewer bits than the alphabet's entropy. Using a lossy compression scheme means that you cannot recover a symbolic-valued signal from its compressed version without incurring some error. You might be wondering why anyone would want to intentionally create errors, but lossy compression schemes are frequently used where the efficiency gained in representing the signal outweighs the significance of the errors.

Shannon's Source Coding Theorem states that symbolic-valued signals require on the average at least $H(A)$ bits to represent each of their values, which are symbols drawn from the alphabet $A$. In the module on the Source Coding Theorem we find that using a so-called fixed rate source coder, one that produces a fixed number of bits/symbol, may not be the most efficient way of encoding symbols into bits. What is not discussed there is a procedure for designing an efficient source coder: one guaranteed to produce the fewest bits/symbol on the average. Such a source coder is not unique, and one approach that does achieve that limit is the Huffman source coding algorithm.

In the early years of information theory, the race was on to be the first to find a provably maximally efficient source coding algorithm. The race was won by then-MIT graduate student David Huffman in 1952, who worked on the problem as a project in his information theory course. We're pretty sure he received an “A.”
  • Create a vertical table for the symbols, the best ordering being in decreasing order of probability.
  • Form a binary tree to the right of the table. A binary tree always has two branches at each node. Build the tree by merging the two lowest probability symbols at each level, making the probability of the node equal to the sum of the merged nodes' probabilities. If more than two nodes/symbols share the lowest probability at a given level, pick any two; your choice won't affect the resulting average code length $\bar{B}(A)$.
  • At each node, label each of the emanating branches with a binary number. The bit sequence obtained from passing from the tree's root to the symbol is its Huffman code. (A minimal code sketch of this procedure appears after the list.)

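To make the steps above concrete, here is a minimal Python sketch of the procedure (our own illustration, not part of the original module; the function name huffman_code and the heap-based bookkeeping are assumptions of this sketch). It repeatedly merges the two lowest-probability nodes, prepending one more bit to every codeword in each merged subtree:

    import heapq

    def huffman_code(probs):
        """Return a {symbol: codeword} dict built by Huffman's algorithm."""
        # Heap entries: (subtree probability, tiebreak counter, {symbol: partial codeword}).
        heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            # Merge the two lowest-probability nodes; ties may be broken arbitrarily.
            p0, _, left = heapq.heappop(heap)
            p1, _, right = heapq.heappop(heap)
            # Label the two emanating branches 0 and 1 by prepending a bit.
            merged = {s: "0" + w for s, w in left.items()}
            merged.update({s: "1" + w for s, w in right.items()})
            heapq.heappush(heap, (p0 + p1, count, merged))
            count += 1
        return heap[0][2]

    probs = {"a1": 1/2, "a2": 1/4, "a3": 1/8, "a4": 1/8}
    print(huffman_code(probs))
    # {'a1': '0', 'a2': '10', 'a3': '110', 'a4': '111'} (branch labels may differ)

The tiebreak counter keeps the heap from ever comparing two codeword dicts when probabilities are equal; which pair gets merged on a tie doesn't matter since, as noted above, the choice doesn't affect $\bar{B}(A)$.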
The simple four-symbol alphabet used in the Entropy and Source Coding modules has the following probabilities, $\Pr[a_1] = \frac{1}{2}$, $\Pr[a_2] = \frac{1}{4}$, $\Pr[a_3] = \frac{1}{8}$, $\Pr[a_4] = \frac{1}{8}$, and an entropy of 1.75 bits. This alphabet has the Huffman coding tree shown in [link].
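The quoted entropy follows directly from the definition:

$$H(A) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{4}\log_2\tfrac{1}{4} + \tfrac{1}{8}\log_2\tfrac{1}{8} + \tfrac{1}{8}\log_2\tfrac{1}{8}\right) = \tfrac{1}{2} + \tfrac{1}{2} + \tfrac{3}{8} + \tfrac{3}{8} = 1.75 \text{ bits}$$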

Huffman coding tree

We form a Huffman code for a four-letter alphabet having the indicated probabilities of occurrence. The binary tree created by the algorithm extends to the right, with the root node (the one at which the tree begins) defining the codewords. The bit sequence obtained by traversing the tree from the root to the symbol defines that symbol's binary code.

The code thus obtained is not unique, as we could have labeled the branches coming out of each node differently. The average number of bits required to represent this alphabet equals 1.75 bits, which is the Shannon entropy limit for this source alphabet. If we had the symbolic-valued signal $s(m) = \{a_2, a_3, a_1, a_4, a_1, a_2, \ldots\}$, our Huffman code would produce the bitstream $b(n) = 101100111010\ldots$.
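To make the encoding step concrete, here is a short sketch, assuming the codeword assignment $a_1 \to 0$, $a_2 \to 10$, $a_3 \to 110$, $a_4 \to 111$ read off the tree (a different branch labeling would give a different but equally efficient bitstream):

    # Codewords from one labeling of the Huffman tree in the figure.
    code = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}

    signal = ["a2", "a3", "a1", "a4", "a1", "a2"]
    bitstream = "".join(code[s] for s in signal)
    print(bitstream)  # 101100111010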

If the alphabet probabilities were different, clearly a different tree, and therefore a different code, could well result. Furthermore, we may not be able to achieve the entropy limit. If our symbols had the probabilities $\Pr[a_1] = \frac{1}{2}$, $\Pr[a_2] = \frac{1}{4}$, $\Pr[a_3] = \frac{1}{5}$, and $\Pr[a_4] = \frac{1}{20}$, the average number of bits/symbol resulting from the Huffman coding algorithm would equal 1.75 bits. However, the entropy limit is 1.68 bits. The Huffman code does satisfy the Source Coding Theorem (its average length is within one bit of the alphabet's entropy), but you might wonder if a better code existed. David Huffman showed mathematically that no other code could achieve a shorter average code length than his. We can't do better.


Derive the Huffman code for this second set of probabilities, and verify the claimed average code length and alphabet entropy.

The Huffman coding tree for the second set of probabilities is identical to that for the first ([link]). The average code length is $\frac{1}{2}\cdot 1 + \frac{1}{4}\cdot 2 + \frac{1}{5}\cdot 3 + \frac{1}{20}\cdot 3 = 1.75$ bits. The entropy calculation is straightforward: $H(A) = -\left(\frac{1}{2}\log_2\frac{1}{2} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{5}\log_2\frac{1}{5} + \frac{1}{20}\log_2\frac{1}{20}\right)$, which equals 1.68 bits.
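These figures are easy to check numerically; a small sketch, with the codeword lengths 1, 2, 3, 3 read off the same tree:

    from math import log2

    probs = {"a1": 1/2, "a2": 1/4, "a3": 1/5, "a4": 1/20}
    lengths = {"a1": 1, "a2": 2, "a3": 3, "a4": 3}  # codeword lengths from the tree

    avg = sum(p * lengths[s] for s, p in probs.items())  # average code length
    H = -sum(p * log2(p) for p in probs.values())        # alphabet entropy
    print(avg)          # 1.75 bits/symbol
    print(round(H, 2))  # 1.68 bits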



Source: OpenStax, Fundamentals of electrical engineering i. OpenStax CNX. Aug 06, 2008. Download for free at http://legacy.cnx.org/content/col10040/1.9