Shannon showed the power of probabilistic models for symbolic-valued signals. The key quantity that characterizes such a signal is the entropy of its alphabet.

Communication theory has been formulated best for symbolic-valued signals. Claude Shannon published in 1948 The Mathematical Theory of Communication, which became the cornerstone of digital communication. He showed the power of probabilistic models for symbolic-valued signals, which allowed him to quantify the information present in a signal. In the simplest signal model, each symbol can occur at index $n$ with a probability $\Pr[a_k]$, $k \in \{1, \dots, K\}$. What this model says is that for each signal value a $K$-sided coin is flipped (note that the coin need not be fair). For this model to make sense, the probabilities must be numbers between zero and one and must sum to one.

$$0 \le \Pr[a_k] \le 1$$
$$\sum_{k=1}^{K} \Pr[a_k] = 1$$
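As a concrete illustration of this model, the short sketch below draws symbols independently according to a fixed probability assignment; the alphabet labels and the probability values are made-up choices used only for illustration.

```python
import random

# Hypothetical four-symbol alphabet and probability assignment (illustrative only).
alphabet = ["a0", "a1", "a2", "a3"]
probs = [0.5, 0.25, 0.125, 0.125]   # each lies in [0, 1] and they sum to one

# Each signal value comes from one flip of a (possibly unfair) K-sided coin,
# independent of all preceding and succeeding symbols.
signal = random.choices(alphabet, weights=probs, k=20)
print(signal)
```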
This coin-flipping model assumes that symbols occur without regard to what preceding or succeeding symbols were, a false assumption for typed text. Despite this probabilistic model's over-simplicity, the ideas we develop here also work when more accurate, but still probabilistic, models are used. The key quantity that characterizes a symbolic-valued signal is the entropy of its alphabet.
$$H(A) = -\sum_{k} \Pr[a_k] \log_2 \Pr[a_k]$$
Because we use the base-2 logarithm, entropy has units of bits. For this definition to make sense, we must take special note of symbols having probability zero of occurring. A zero-probability symbol never occurs; thus, we define $0 \log_2 0 = 0$ so that such symbols do not affect the entropy. The maximum value attainable by an alphabet's entropy occurs when the symbols are equally likely ($\Pr[a_k] = \Pr[a_l]$ for all $k$, $l$). In this case, the entropy equals $\log_2 K$. The minimum value occurs when only one symbol occurs; it has probability one of occurring and the rest have probability zero.
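The definition translates directly into a few lines of code. The sketch below is an illustrative helper (not part of the text) that applies the convention $0 \log_2 0 = 0$ and checks the two extreme cases just described.

```python
import math

def entropy(probs):
    """Entropy, in bits, of an alphabet whose symbols occur with the given probabilities."""
    # Skipping zero-probability symbols implements the convention 0*log2(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 4
print(entropy([1 / K] * K))    # equally likely symbols: log2(K) = 2.0 bits (the maximum)
print(entropy([1, 0, 0, 0]))   # one certain symbol: 0.0 bits (the minimum)
```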

Derive the maximum-entropy results, both the numeric aspect (entropy equals $\log_2 K$) and the theoretical one (equally likely symbols maximize entropy). Derive the value of the minimum-entropy alphabet.

Equally likely symbols each have a probability of $\frac{1}{K}$. Thus, $H(A) = -\sum_{k} \frac{1}{K} \log_2 \frac{1}{K} = \log_2 K$. To prove that this is the maximum-entropy probability assignment, we must explicitly take into account that probabilities sum to one. Focus on a particular symbol, say the first. $\Pr[a_0]$ appears twice in the entropy formula: in the terms $\Pr[a_0] \log_2 \Pr[a_0]$ and $\left(1 - \Pr[a_0] - \dots - \Pr[a_{K-2}]\right) \log_2\left(1 - \Pr[a_0] - \dots - \Pr[a_{K-2}]\right)$, the latter because the last symbol's probability equals one minus the sum of all the others. The derivative with respect to this probability (and all the others) must be zero. The derivative equals $\log_2 \Pr[a_0] - \log_2\left(1 - \Pr[a_0] - \dots - \Pr[a_{K-2}]\right)$, and all other derivatives have the same form (just substitute your letter's index). Thus, each probability must equal the others, and we are done. For the minimum entropy answer, one term is $1 \cdot \log_2 1 = 0$, and the others are $0 \cdot \log_2 0$, which we define to be zero also. The minimum value of entropy is zero.
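The calculus argument can also be sanity-checked numerically: shifting any probability mass away from the equally likely assignment, while keeping the probabilities summing to one, can only lower the entropy. The sketch below uses an arbitrary perturbation size chosen purely for illustration.

```python
import math

def entropy(probs):
    # Entropy in bits, using the convention 0*log2(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Shift probability mass between two symbols of the equally likely assignment;
# the probabilities still sum to one, but the entropy decreases.
K, eps = 4, 0.05
uniform = [1 / K] * K
perturbed = [1 / K + eps, 1 / K - eps] + [1 / K] * (K - 2)

print(entropy(uniform))    # log2(4) = 2.0 bits
print(entropy(perturbed))  # strictly less than 2.0 bits
```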


A four-symbol alphabet has the following probabilities: $\Pr[a_0] = \frac{1}{2}$, $\Pr[a_1] = \frac{1}{4}$, $\Pr[a_2] = \frac{1}{8}$, $\Pr[a_3] = \frac{1}{8}$. Note that these probabilities sum to one, as they should. Because $\frac{1}{2} = 2^{-1}$, $\log_2 \frac{1}{2} = -1$. The entropy of this alphabet equals

$$H(A) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{4}\log_2\tfrac{1}{4} + \tfrac{1}{8}\log_2\tfrac{1}{8} + \tfrac{1}{8}\log_2\tfrac{1}{8}\right) = -\left(\tfrac{1}{2}(-1) + \tfrac{1}{4}(-2) + \tfrac{1}{8}(-3) + \tfrac{1}{8}(-3)\right) = 1.75 \text{ bits}$$
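A couple of lines of Python confirm this arithmetic; the script below is only an illustrative check of the worked example.

```python
import math

probs = [1 / 2, 1 / 4, 1 / 8, 1 / 8]   # the four-symbol alphabet above

# -(1/2*(-1) + 1/4*(-2) + 1/8*(-3) + 1/8*(-3)) = 1.75
H = -sum(p * math.log2(p) for p in probs)
print(H, "bits")   # 1.75 bits
```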







Source: OpenStax, Fundamentals of Electrical Engineering I. OpenStax CNX. Aug 06, 2008. Download for free at http://legacy.cnx.org/content/col10040/1.9
