Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- A word on notation
- List of symbols
- Part I The plane
- Part II Matrix structures
- Part III Here's to probability
- Part IV Information, error and belief
- 12 Entropy and coding
- 13 Information and error correction
- Part V Transforming the image
- Part VI See, edit, reconstruct
- References
- Index
13 - Information and error correction
from Part IV - Information, error and belief
Published online by Cambridge University Press: 05 November 2012
Summary
In the previous chapter we introduced Shannon's concept of the amount of information (entropy) conveyed by an unknown symbol as the degree of our uncertainty about it. This was applied to encoding a message, or sequence of symbols, in the minimum number of bits, with image compression as a key application. The theory was ‘noiseless’ in that no account was taken of loss through distortion as information is conveyed from one site to another. Now we consider some ways in which information theory handles the problem of distortion, and its solution. (For the historical development, see Slepian, 1974, Sloane and Wyner, 1993, or Verdú and McLaughlin, 2000.)
Physically, the journey can be anything from microns along a computer ‘bus’, to kilometres through our planet's atmosphere, to a link across the Universe reaching a space probe or distant galaxy. In Shannon's model of a communication system, Figure 12.1, we think of the symbols reaching their destination via a ‘channel’, which mathematically is a distribution of conditional probabilities for what is received, given what was sent.
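The idea of a channel as a set of conditional probabilities can be made concrete with the standard textbook example of a binary symmetric channel. The sketch below is illustrative only and is not taken from the book; the function name `bsc` and the crossover probability `p = 0.1` are choices made here for the example.

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Simulate a binary symmetric channel: each transmitted bit is
    flipped independently with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

# The channel, mathematically, is just the conditional distribution
# P(received r | sent s).  For crossover probability p = 0.1:
p = 0.1
transition = {(s, r): (1 - p) if s == r else p
              for s in (0, 1) for r in (0, 1)}
# e.g. transition[(0, 0)] == 0.9, transition[(0, 1)] == 0.1
```

With `p = 0` the channel is noiseless and every bit arrives intact; as `p` grows toward 1/2, the output carries progressively less information about the input.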
The model incorporates our assumptions about ‘noise’, which could be due to equipment which is faulty or used outside its specifications, atmospheric conditions, interference from other messages, and so on. Some possibilities are shown in Table 13.1.
We prove Shannon's (‘noisy’) Channel Coding Theorem, then review progress in finding practical error-correcting codes that approach the possibilities predicted by that theorem for successful transmission in the face of corruption by a noisy channel.
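The simplest example of error correction against such corruption, often used to motivate the more powerful codes this chapter develops, is the repetition code with majority-vote decoding. The sketch below is an illustration under that assumption, not code from the book; the names `encode` and `decode` and the choice `n = 3` are hypothetical.

```python
def encode(bits, n=3):
    """n-fold repetition code: transmit each message bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    """Majority-vote decoding: each block of n received bits is
    decoded to whichever bit value occurs more often."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]
```

For odd `n` this corrects up to `(n - 1) // 2` flipped bits per block, but at the cost of an n-fold drop in transmission rate; Shannon's theorem asserts that far better trade-offs exist, and the codes reviewed in this chapter pursue them.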
- Type: Chapter
- Information: Mathematics of Digital Images: Creation, Compression, Restoration, Recognition, pp. 444–520
- Publisher: Cambridge University Press
- Print publication year: 2006