Image coding
This module introduces entropy of source information.

Entropy of source information was discussed in the third-year E5 Information and Coding course. For an image x, quantised to M levels, the entropy H(x) is defined as:

H(x) = Σ_{i=0}^{M-1} p_i log2(1/p_i) = -Σ_{i=0}^{M-1} p_i log2(p_i)
where p_i, for i = 0 to M-1, is the probability of the i-th quantiser level being used (often obtained from a histogram of the pel intensities).

H(x) represents the mean number of bits per pel with which the quantised image x can be represented using an ideal variable-length entropy code. A Huffman code usually approximates this bit rate quite closely.
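To make the comparison between the entropy bound and a Huffman code concrete, here is a minimal sketch (not from the original module) that builds Huffman code lengths for a small hypothetical 4-level distribution and compares the mean code length with H(x):

```python
import heapq
from math import log2

def entropy_bits(probs):
    """Ideal mean bits per symbol, H = -sum p*log2(p)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Return Huffman code lengths for the given probabilities."""
    # Each heap item: (probability, tiebreak counter, member symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to every member
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

# Hypothetical 4-level source for illustration
probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
mean_len = sum(p * l for p, l in zip(probs, lengths))
print(entropy_bits(probs))   # 1.75
print(mean_len)              # 1.75: Huffman is exact for dyadic probabilities
```

For probabilities that are powers of 1/2 the Huffman code meets the entropy bound exactly; in general it is within 1 bit/symbol, and usually much closer.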

To obtain the number of bits to code an image (or subimage) x containing N pels:

  • A histogram of x is measured using M bins corresponding to the M quantiser levels.
  • The M histogram counts are each divided by N to give the probabilities p_i, which are then converted into entropies h_i = -p_i log2(p_i). This conversion law is illustrated in the figure below, which shows that probabilities close to zero or one produce low entropy, while intermediate values produce entropies near 0.5.
  • The entropies h_i of the separate quantiser levels are summed to give the total entropy H(x) for the subimage.
  • Multiplying H(x) by N gives the estimated total number of bits needed to code x, assuming an ideal entropy code is available which is matched to the histogram of x.
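The four steps above can be sketched as follows (a minimal illustration, not from the original module; the toy image and level count are hypothetical):

```python
from math import log2

def estimated_bits(image, M):
    """Estimate total bits to code a quantised image with an ideal entropy code.

    image: flat list of quantiser-level indices in range(M).
    Implements the steps above: histogram -> probabilities p_i ->
    per-level entropies h_i = -p_i log2(p_i) -> H(x) -> N * H(x).
    """
    N = len(image)
    counts = [0] * M                       # step 1: histogram with M bins
    for v in image:
        counts[v] += 1
    probs = [c / N for c in counts]        # step 2: probabilities p_i
    h = [-p * log2(p) if p > 0 else 0.0 for p in probs]  # entropies h_i
    H = sum(h)                             # step 3: total entropy H(x), bit/pel
    return N * H                           # step 4: estimated total bits

# Hypothetical toy image: 8 pels, 4 quantiser levels
print(estimated_bits([0, 0, 0, 0, 1, 1, 2, 3], M=4))  # 8 * 1.75 = 14.0
```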

Conversion from probability p_i to entropy h_i = -p_i log2(p_i).
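A few sample values of this conversion law can be checked directly (a small sketch, not part of the original module):

```python
from math import log2, e

def h(p):
    """Entropy contribution h = -p log2(p), defined as 0 at p = 0."""
    return -p * log2(p) if p > 0 else 0.0

# Probabilities near 0 or 1 contribute little entropy;
# the curve peaks at p = 1/e, where h = log2(e)/e ~ 0.531.
for p in (0.01, 1 / e, 0.5, 0.99):
    print(p, h(p))
```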

The figures below show the probabilities p_i and entropies h_i for the original Lenna image and for each of the subimages in the previous figure, assuming a uniform quantiser with a step size Q_step = 15 in each case. The original Lenna image contained pel values from 3 to 238, and a mean level of 120 was subtracted from each pel value before the image was analysed or transformed, so that all samples would be approximately evenly distributed about zero (a natural feature of highpass subimages).

Probability histogram (dashed) and entropies (solid) of the original Lenna image.
Probability histogram (dashed) and entropies (solid) of the four subimages of the Level 1 Haar transform of Lenna (see the previous figure).

The Haar transform preserves energy, so the expected distortion energy from quantising the transformed image y with a given step size Q_step will be approximately the same as that from quantising the input image x with the same step size. This is because quantising errors can usually be modelled as independent random processes with variance (energy) Q_step^2 / 12, and the total squared quantising error (distortion) will tend to the sum of the variances over all pels. This applies whether the error energies are summed before or after the inverse transform (reconstruction) in the decoder.
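The Q_step^2 / 12 variance can be verified with a quick Monte Carlo sketch (a hypothetical set-up, not from the original module): uniformly quantising a zero-mean signal gives rounding errors roughly uniform on [-Q_step/2, Q_step/2], whose variance is Q_step^2 / 12.

```python
import random

random.seed(0)
Qstep = 15
n = 100_000
errors = []
for _ in range(n):
    x = random.uniform(-120, 120)         # hypothetical zero-mean pel value
    q = Qstep * round(x / Qstep)          # uniform quantisation with step Qstep
    errors.append(x - q)

var = sum(err * err for err in errors) / n
print(var, Qstep**2 / 12)   # both are approximately 18.75
```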

Hence equal quantiser step sizes before and after an energy-preserving transformation should generate equivalent quantising distortions and provide a fair estimate of the compression achieved by the transformation.

The first two columns of the table below (original and level 1) compare the entropy (mean bit rate) per pel for the original image (3.71 bit/pel) with that of the Haar transformed image of the previous figure (2.08 bit/pel), using Q_step = 15. Notice that the entropy of the original image is almost as great as the 4 bit/pel that would be needed to code the 16 levels using a simple fixed-length code, because the histogram is relatively uniform.

The level 1 column of the table shows the contribution of each of the subimages of the previous figure to the total entropy per pel (the subimage entropies have been divided by 4, since each subimage has one quarter of the total number of pels). The Lo-Lo subimage contributes 56% of the total entropy (bit rate) and has similar spatial correlations to the original image. Hence it is a logical step to apply the Haar transform again to this subimage.
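The quarter-weighting arithmetic can be sketched as follows. The per-subimage entropies used here are purely hypothetical placeholders (the real values depend on the image and the quantiser step), but the weighting rule is the one described above:

```python
# Hypothetical per-subimage entropies (bit/pel within each subimage)
sub_entropies = {"Lo-Lo": 4.6, "Hi-Lo": 1.4, "Lo-Hi": 1.1, "Hi-Hi": 0.3}

# Each subimage holds one quarter of the pels, so weight each entropy by 1/4
contributions = {name: H / 4 for name, H in sub_entropies.items()}
total = sum(contributions.values())       # mean bit rate of transformed image

print(round(total, 2))                    # 1.85 bit/pel for these toy numbers
print(round(contributions["Lo-Lo"] / total, 2))   # Lo-Lo fraction of the total
```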

Mean bit rate for the original Lenna image and for the Haar transforms of the image after 1 to 4 levels, using a quantiser step size Q_step = 15.

Source:  OpenStax, Image coding. OpenStax CNX. Jan 22, 2004 Download for free at http://cnx.org/content/col10206/1.3
