Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- A word on notation
- List of symbols
- Part I The plane
- Part II Matrix structures
- Part III Here's to probability
- Part IV Information, error and belief
- 12 Entropy and coding
- 13 Information and error correction
- Part V Transforming the image
- Part VI See, edit, reconstruct
- References
- Index
12 - Entropy and coding
from Part IV - Information, error and belief
Published online by Cambridge University Press: 05 November 2012
Summary
In this chapter we introduce the basic idea of entropy, quantifying an amount of information, and in its light we consider some important methods of encoding a sequence of symbols. We shall be thinking of these as text, but they also apply to a byte sequence representing pixel values of a digital image. In the next chapter we shall develop information theory to take account of noise, both visual and otherwise. Here we focus on ‘noiseless encoding’ in preparation for that later step. However, before leaving this chapter we take time to examine an alternative approach to quantifying information, which has resulted in the important idea of Minimum Description Length as a new principle in choosing hypotheses and models.
The idea of entropy
Shannon (1948), the acknowledged inventor of information theory, considered that a basis for his theory already existed in papers of Nyquist (1924) and Hartley (1928). The latter had argued that the logarithm function was the most natural function for measuring information. For example, as Shannon notes, adding one relay to a group doubles the number of possible states, but adds one to the base 2 log of this number. Thus information might be measured as the number of bits, or binary digits b_i = 0, 1, required to express an integer in binary form: b_m ... b_1 b_0, with value Σ_i b_i 2^i. For example, 34 = 100010 takes six bits. Shannon proposed a 5-component model of a communication system, reproduced in Figure 12.1.
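The bit-count measure above can be sketched in a few lines of Python (a minimal illustration, not taken from the book): the number of binary digits needed for a positive integer n is floor(log2(n)) + 1, which is exactly the length of its base-2 expansion b_m ... b_1 b_0.

```python
import math

def bits_needed(n: int) -> int:
    """Number of binary digits required to express a positive integer n."""
    return math.floor(math.log2(n)) + 1

n = 34
binary = format(n, 'b')           # base-2 expansion of n
print(binary, bits_needed(n))     # 100010 6

# Adding one bit (one relay, in Shannon's example) doubles the number of
# representable states: 6 bits give 2**6 = 64 states, 7 bits give 128.
```

This mirrors Hartley's point that information grows logarithmically: doubling the number of states adds exactly one to the bit count.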
- Type: Chapter
- Mathematics of Digital Images: Creation, Compression, Restoration, Recognition, pp. 395-443. Publisher: Cambridge University Press. Print publication year: 2006.